Dec 01 10:31:21 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 01 10:31:21 crc restorecon[4744]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 10:31:21 crc restorecon[4744]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 01 10:31:21 crc restorecon[4744]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:21 crc 
restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:21 crc restorecon[4744]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:31:21 crc restorecon[4744]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:31:21 crc restorecon[4744]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:31:21 crc 
restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 
10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 10:31:21 crc restorecon[4744]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:31:21 crc 
restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:31:21 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:31:22 crc 
restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 
crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 
10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 10:31:22 crc 
restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc 
restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 10:31:22 crc restorecon[4744]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc 
restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 
crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc 
restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc 
restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc 
restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc 
restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc restorecon[4744]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 10:31:22 crc restorecon[4744]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 10:31:22 crc restorecon[4744]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 01 10:31:23 crc kubenswrapper[4909]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 10:31:23 crc kubenswrapper[4909]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 01 10:31:23 crc kubenswrapper[4909]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 10:31:23 crc kubenswrapper[4909]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 01 10:31:23 crc kubenswrapper[4909]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 01 10:31:23 crc kubenswrapper[4909]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.029952 4909 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.032992 4909 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033014 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033018 4909 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033022 4909 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033026 4909 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033030 4909 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033034 4909 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033038 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033041 4909 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033054 4909 
feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033058 4909 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033062 4909 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033067 4909 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033071 4909 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033075 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033079 4909 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033082 4909 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033086 4909 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033089 4909 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033093 4909 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033096 4909 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033100 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033104 4909 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033107 4909 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion 
Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033111 4909 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033114 4909 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033118 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033121 4909 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033124 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033129 4909 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033133 4909 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033137 4909 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033142 4909 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033147 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033151 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033157 4909 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033161 4909 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033168 4909 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033172 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033183 4909 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033187 4909 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033191 4909 feature_gate.go:330] unrecognized feature gate: Example Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033197 4909 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033200 4909 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033204 4909 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033207 4909 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033210 4909 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033214 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033218 4909 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033221 4909 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033224 4909 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033228 4909 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033231 4909 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033235 4909 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033238 4909 feature_gate.go:330] unrecognized 
feature gate: MetricsCollectionProfiles Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033241 4909 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033245 4909 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033248 4909 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033252 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033255 4909 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033258 4909 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033262 4909 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033265 4909 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033270 4909 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033275 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033279 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033283 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033287 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033291 4909 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033294 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.033297 4909 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033658 4909 flags.go:64] FLAG: --address="0.0.0.0" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033670 4909 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033677 4909 flags.go:64] FLAG: --anonymous-auth="true" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033685 4909 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033690 4909 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033695 4909 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033702 4909 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033707 4909 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 
01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033711 4909 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033716 4909 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033721 4909 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033725 4909 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033729 4909 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033734 4909 flags.go:64] FLAG: --cgroup-root="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033738 4909 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033742 4909 flags.go:64] FLAG: --client-ca-file="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033746 4909 flags.go:64] FLAG: --cloud-config="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033751 4909 flags.go:64] FLAG: --cloud-provider="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033755 4909 flags.go:64] FLAG: --cluster-dns="[]" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033760 4909 flags.go:64] FLAG: --cluster-domain="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033764 4909 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033769 4909 flags.go:64] FLAG: --config-dir="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033773 4909 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033778 4909 flags.go:64] FLAG: --container-log-max-files="5" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033783 4909 flags.go:64] FLAG: 
--container-log-max-size="10Mi" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033787 4909 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033792 4909 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033796 4909 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033803 4909 flags.go:64] FLAG: --contention-profiling="false" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033807 4909 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033811 4909 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033816 4909 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033820 4909 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033825 4909 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033830 4909 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033834 4909 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033838 4909 flags.go:64] FLAG: --enable-load-reader="false" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033842 4909 flags.go:64] FLAG: --enable-server="true" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033846 4909 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033851 4909 flags.go:64] FLAG: --event-burst="100" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033856 4909 flags.go:64] FLAG: --event-qps="50" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 
10:31:23.033861 4909 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033866 4909 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033885 4909 flags.go:64] FLAG: --eviction-hard="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033890 4909 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033894 4909 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033898 4909 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033903 4909 flags.go:64] FLAG: --eviction-soft="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033907 4909 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033911 4909 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033915 4909 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033919 4909 flags.go:64] FLAG: --experimental-mounter-path="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033923 4909 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033927 4909 flags.go:64] FLAG: --fail-swap-on="true" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033931 4909 flags.go:64] FLAG: --feature-gates="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033936 4909 flags.go:64] FLAG: --file-check-frequency="20s" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033941 4909 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033946 4909 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 01 10:31:23 crc 
kubenswrapper[4909]: I1201 10:31:23.033951 4909 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033957 4909 flags.go:64] FLAG: --healthz-port="10248" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033966 4909 flags.go:64] FLAG: --help="false" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033971 4909 flags.go:64] FLAG: --hostname-override="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033975 4909 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033979 4909 flags.go:64] FLAG: --http-check-frequency="20s" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033983 4909 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033988 4909 flags.go:64] FLAG: --image-credential-provider-config="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033991 4909 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033995 4909 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.033999 4909 flags.go:64] FLAG: --image-service-endpoint="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034003 4909 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034007 4909 flags.go:64] FLAG: --kube-api-burst="100" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034011 4909 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034016 4909 flags.go:64] FLAG: --kube-api-qps="50" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034020 4909 flags.go:64] FLAG: --kube-reserved="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034024 4909 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 01 10:31:23 crc 
kubenswrapper[4909]: I1201 10:31:23.034027 4909 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034032 4909 flags.go:64] FLAG: --kubelet-cgroups="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034036 4909 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034040 4909 flags.go:64] FLAG: --lock-file="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034044 4909 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034048 4909 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034052 4909 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034059 4909 flags.go:64] FLAG: --log-json-split-stream="false" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034063 4909 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034067 4909 flags.go:64] FLAG: --log-text-split-stream="false" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034071 4909 flags.go:64] FLAG: --logging-format="text" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034075 4909 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034079 4909 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034083 4909 flags.go:64] FLAG: --manifest-url="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034088 4909 flags.go:64] FLAG: --manifest-url-header="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034095 4909 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034100 4909 flags.go:64] FLAG: --max-open-files="1000000" Dec 01 
10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034109 4909 flags.go:64] FLAG: --max-pods="110" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034114 4909 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034120 4909 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034125 4909 flags.go:64] FLAG: --memory-manager-policy="None" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034129 4909 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034134 4909 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034138 4909 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034142 4909 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034152 4909 flags.go:64] FLAG: --node-status-max-images="50" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034157 4909 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034161 4909 flags.go:64] FLAG: --oom-score-adj="-999" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034166 4909 flags.go:64] FLAG: --pod-cidr="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034170 4909 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034176 4909 flags.go:64] FLAG: --pod-manifest-path="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034181 4909 flags.go:64] FLAG: --pod-max-pids="-1" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034185 
4909 flags.go:64] FLAG: --pods-per-core="0" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034189 4909 flags.go:64] FLAG: --port="10250" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034194 4909 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034198 4909 flags.go:64] FLAG: --provider-id="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034202 4909 flags.go:64] FLAG: --qos-reserved="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034206 4909 flags.go:64] FLAG: --read-only-port="10255" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034211 4909 flags.go:64] FLAG: --register-node="true" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034215 4909 flags.go:64] FLAG: --register-schedulable="true" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034224 4909 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034231 4909 flags.go:64] FLAG: --registry-burst="10" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034236 4909 flags.go:64] FLAG: --registry-qps="5" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034240 4909 flags.go:64] FLAG: --reserved-cpus="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034243 4909 flags.go:64] FLAG: --reserved-memory="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034248 4909 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034252 4909 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034257 4909 flags.go:64] FLAG: --rotate-certificates="false" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034261 4909 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034266 4909 flags.go:64] FLAG: --runonce="false" Dec 01 10:31:23 
crc kubenswrapper[4909]: I1201 10:31:23.034270 4909 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034274 4909 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034278 4909 flags.go:64] FLAG: --seccomp-default="false" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034282 4909 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034286 4909 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034291 4909 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034295 4909 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034299 4909 flags.go:64] FLAG: --storage-driver-password="root" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034303 4909 flags.go:64] FLAG: --storage-driver-secure="false" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034307 4909 flags.go:64] FLAG: --storage-driver-table="stats" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034311 4909 flags.go:64] FLAG: --storage-driver-user="root" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034315 4909 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034320 4909 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034324 4909 flags.go:64] FLAG: --system-cgroups="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034328 4909 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034334 4909 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034338 4909 
flags.go:64] FLAG: --tls-cert-file="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034341 4909 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034346 4909 flags.go:64] FLAG: --tls-min-version="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034350 4909 flags.go:64] FLAG: --tls-private-key-file="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034354 4909 flags.go:64] FLAG: --topology-manager-policy="none" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034359 4909 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034364 4909 flags.go:64] FLAG: --topology-manager-scope="container" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034368 4909 flags.go:64] FLAG: --v="2" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034373 4909 flags.go:64] FLAG: --version="false" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034379 4909 flags.go:64] FLAG: --vmodule="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034384 4909 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034388 4909 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034503 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034507 4909 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034511 4909 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034514 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034519 4909 feature_gate.go:353] Setting 
GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034524 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034528 4909 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034532 4909 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034537 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034541 4909 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034544 4909 feature_gate.go:330] unrecognized feature gate: Example Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034548 4909 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034552 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034555 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034559 4909 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034563 4909 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034567 4909 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034570 4909 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034573 4909 
feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034577 4909 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034580 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034584 4909 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034587 4909 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034591 4909 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034594 4909 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034598 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034603 4909 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034607 4909 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034610 4909 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034614 4909 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034617 4909 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034620 4909 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034624 4909 feature_gate.go:330] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034627 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034631 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034634 4909 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034638 4909 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034641 4909 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034644 4909 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034648 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034653 4909 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034657 4909 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034661 4909 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034665 4909 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034671 4909 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034676 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034681 4909 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034687 4909 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034690 4909 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034694 4909 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034702 4909 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034705 4909 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034709 4909 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034712 4909 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034716 4909 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034719 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034723 4909 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034726 4909 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034732 4909 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034735 4909 feature_gate.go:330] 
unrecognized feature gate: InsightsRuntimeExtractor Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034739 4909 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034742 4909 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034746 4909 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034750 4909 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034753 4909 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034756 4909 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034760 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034763 4909 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034767 4909 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034771 4909 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.034775 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.034787 4909 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false 
UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.049311 4909 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.049409 4909 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049560 4909 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049576 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049587 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049596 4909 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049606 4909 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049616 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049625 4909 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049635 4909 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049644 4909 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049653 4909 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049662 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049671 4909 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 
01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049679 4909 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049687 4909 feature_gate.go:330] unrecognized feature gate: Example Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049696 4909 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049704 4909 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049715 4909 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049728 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049738 4909 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049748 4909 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049757 4909 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049765 4909 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049773 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049781 4909 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049789 4909 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049797 4909 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049804 4909 
feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049814 4909 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049822 4909 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049830 4909 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049838 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049847 4909 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049856 4909 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049865 4909 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049899 4909 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049911 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049923 4909 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049935 4909 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049945 4909 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049955 4909 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049965 4909 feature_gate.go:330] unrecognized feature gate: 
AlibabaPlatform Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049975 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049986 4909 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.049996 4909 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050005 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050015 4909 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050025 4909 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050034 4909 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050045 4909 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050057 4909 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050070 4909 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050084 4909 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050094 4909 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050106 4909 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050117 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050127 4909 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050138 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050149 4909 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050160 4909 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050172 4909 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050183 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050197 4909 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050210 4909 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050221 4909 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050230 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050240 4909 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050254 4909 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050265 4909 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050278 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050287 4909 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050297 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.050314 4909 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050628 4909 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050654 4909 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050663 4909 feature_gate.go:330] unrecognized feature gate: Example Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050673 4909 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050681 4909 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050690 4909 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050701 4909 feature_gate.go:353] Setting GA 
feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050713 4909 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050721 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050730 4909 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050739 4909 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050749 4909 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050757 4909 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050768 4909 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050776 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050784 4909 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050792 4909 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050802 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050813 4909 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050823 4909 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050833 4909 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050842 4909 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050852 4909 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050927 4909 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050939 4909 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050947 4909 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050955 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050964 4909 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050973 4909 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050981 4909 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050989 4909 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.050997 4909 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051005 4909 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051013 4909 
feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051024 4909 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051035 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051043 4909 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051051 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051059 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051068 4909 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051076 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051084 4909 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051092 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051100 4909 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051109 4909 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051117 4909 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051124 4909 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051135 4909 feature_gate.go:353] 
Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051147 4909 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051157 4909 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051167 4909 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051176 4909 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051184 4909 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051193 4909 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051203 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051213 4909 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051222 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051230 4909 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051239 4909 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051247 4909 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051256 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051264 4909 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallGCP Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051272 4909 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051280 4909 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051291 4909 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051299 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051307 4909 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051315 4909 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051323 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051331 4909 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.051339 4909 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.051354 4909 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.052033 4909 server.go:940] "Client rotation is on, will bootstrap in background" Dec 01 
10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.057855 4909 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.058065 4909 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.058973 4909 server.go:997] "Starting client certificate rotation" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.059034 4909 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.059241 4909 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-17 01:55:21.078485177 +0000 UTC Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.059363 4909 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.066502 4909 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 10:31:23 crc kubenswrapper[4909]: E1201 10:31:23.068827 4909 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.72:6443: connect: connection refused" logger="UnhandledError" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.069324 4909 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.079783 4909 log.go:25] "Validated CRI v1 runtime API" Dec 01 10:31:23 crc 
kubenswrapper[4909]: I1201 10:31:23.111696 4909 log.go:25] "Validated CRI v1 image API" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.114980 4909 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.119537 4909 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-01-10-26-50-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.119598 4909 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.153689 4909 manager.go:217] Machine: {Timestamp:2025-12-01 10:31:23.15130444 +0000 UTC m=+0.385775418 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:b132f599-ba64-4f09-b8b2-2af8c2f13405 BootID:578ee329-32ca-4325-930b-3c9b1b6c332b Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 
Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:d9:75:4e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:d9:75:4e Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:c6:62:53 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:f8:fc:d3 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:23:cc:bb Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:6c:cc:4a Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:ba:cb:cd Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ca:5a:4c:4f:1c:b4 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:56:6c:42:bc:6b:30 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 
Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} 
{Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.154195 4909 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.154459 4909 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.157130 4909 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.157483 4909 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.157540 4909 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.157943 4909 topology_manager.go:138] "Creating topology manager with none policy"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.157963 4909 container_manager_linux.go:303] "Creating device plugin manager"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.158268 4909 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.158324 4909 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.158780 4909 state_mem.go:36] "Initialized new in-memory state store"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.159014 4909 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.160339 4909 kubelet.go:418] "Attempting to sync node with API server"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.160395 4909 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.160442 4909 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.160474 4909 kubelet.go:324] "Adding apiserver pod source"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.160503 4909 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.162744 4909 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.163197 4909 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.164947 4909 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.165503 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.165532 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.165549 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.165558 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.165575 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.165594 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.165604 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.165618 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.165628 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.165640 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.165673 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.165683 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.165904 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.166497 4909 server.go:1280] "Started kubelet"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.166790 4909 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.166794 4909 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.166969 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.72:6443: connect: connection refused
Dec 01 10:31:23 crc kubenswrapper[4909]: E1201 10:31:23.167107 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.72:6443: connect: connection refused" logger="UnhandledError"
Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.167286 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.72:6443: connect: connection refused
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.167984 4909 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.72:6443: connect: connection refused
Dec 01 10:31:23 crc kubenswrapper[4909]: E1201 10:31:23.168001 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.72:6443: connect: connection refused" logger="UnhandledError"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.167341 4909 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 01 10:31:23 crc systemd[1]: Started Kubernetes Kubelet.
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.169808 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.169818 4909 server.go:460] "Adding debug handlers to kubelet server"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.169840 4909 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 01 10:31:23 crc kubenswrapper[4909]: E1201 10:31:23.170557 4909 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.170592 4909 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.171249 4909 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.170603 4909 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.170910 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 09:11:10.870830284 +0000 UTC
Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.171015 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.72:6443: connect: connection refused
Dec 01 10:31:23 crc kubenswrapper[4909]: E1201 10:31:23.171411 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.72:6443: connect: connection refused" interval="200ms"
Dec 01 10:31:23 crc kubenswrapper[4909]: E1201 10:31:23.171422 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.72:6443: connect: connection refused" logger="UnhandledError"
Dec 01 10:31:23 crc kubenswrapper[4909]: E1201 10:31:23.172481 4909 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.72:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d10c8b216f2ed default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 10:31:23.166454509 +0000 UTC m=+0.400925417,LastTimestamp:2025-12-01 10:31:23.166454509 +0000 UTC m=+0.400925417,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.177067 4909 factory.go:55] Registering systemd factory
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.177140 4909 factory.go:221] Registration of the systemd container factory successfully
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.177719 4909 factory.go:153] Registering CRI-O factory
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.177758 4909 factory.go:221] Registration of the crio container factory successfully
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.177966 4909 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.178052 4909 factory.go:103] Registering Raw factory
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.178092 4909 manager.go:1196] Started watching for new ooms in manager
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.179688 4909 manager.go:319] Starting recovery of all containers
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.185729 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.185781 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.185797 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.185812 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.185825 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.185841 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.185855 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.185887 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.185907 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.185921 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.185936 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.185950 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.185964 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.185980 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.185995 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186010 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186022 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186033 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186045 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186057 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186070 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186086 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186101 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186115 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186130 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186142 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186158 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186173 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186189 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186202 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186248 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186290 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186311 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186344 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186361 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186375 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186389 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186402 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186417 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186432 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186446 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186458 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186473 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186489 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186540 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186561 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186579 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186595 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186608 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186626 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186641 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186658 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186681 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186699 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186714 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186730 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186745 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186759 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186773 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186813 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186830 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186844 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186861 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186896 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186908 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186922 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186936 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186948 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186963 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186978 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.186992 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187005 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187019 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187031 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187045 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187062 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187075 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187090 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187104 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187122 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187139 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187153 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187261 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187275 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the
actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187288 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187300 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187315 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187330 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187343 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187357 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187370 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187384 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187397 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187411 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187424 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187440 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187452 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187465 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187480 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187495 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187508 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.187522 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189308 4909 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189342 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189359 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189381 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189397 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189413 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189428 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189443 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189457 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189475 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189490 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189508 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189523 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189538 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189552 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189567 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189581 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189593 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189609 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189625 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189639 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189653 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189666 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189680 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189694 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189709 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189723 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189738 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189751 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189765 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" 
seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189779 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189793 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189806 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189821 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189836 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189852 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 
10:31:23.189867 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189918 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189932 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189951 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189965 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189980 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.189994 4909 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190008 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190022 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190035 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190049 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190064 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190081 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190096 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190110 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190123 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190138 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190150 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190163 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190178 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190194 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190209 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190223 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190239 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190253 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190265 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190280 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190295 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190307 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190325 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190338 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190350 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190364 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190378 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190391 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190404 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190416 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" 
seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190431 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190445 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190459 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190471 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190485 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190498 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190510 4909 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190564 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190583 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190595 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190609 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190622 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190635 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190648 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190664 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190676 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190690 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190702 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190715 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190727 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190740 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190754 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190767 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190778 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190807 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190821 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190835 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190847 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190858 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190886 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190900 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190913 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190928 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190940 4909 reconstruct.go:97] "Volume reconstruction finished" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.190950 4909 reconciler.go:26] "Reconciler: start to sync state" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.198539 4909 manager.go:324] Recovery completed Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.213787 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.216162 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.216198 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.216209 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.217527 4909 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.217557 4909 cpu_manager.go:226] "Reconciling" 
reconcilePeriod="10s" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.217591 4909 state_mem.go:36] "Initialized new in-memory state store" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.253652 4909 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.255765 4909 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.255857 4909 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.255925 4909 kubelet.go:2335] "Starting kubelet main sync loop" Dec 01 10:31:23 crc kubenswrapper[4909]: E1201 10:31:23.256032 4909 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.258681 4909 policy_none.go:49] "None policy: Start" Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.259135 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.72:6443: connect: connection refused Dec 01 10:31:23 crc kubenswrapper[4909]: E1201 10:31:23.259213 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.72:6443: connect: connection refused" logger="UnhandledError" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.259522 4909 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.259552 4909 state_mem.go:35] "Initializing new in-memory 
state store" Dec 01 10:31:23 crc kubenswrapper[4909]: E1201 10:31:23.271507 4909 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.325138 4909 manager.go:334] "Starting Device Plugin manager" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.325220 4909 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.325234 4909 server.go:79] "Starting device plugin registration server" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.325674 4909 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.325692 4909 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.326520 4909 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.326602 4909 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.326610 4909 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 01 10:31:23 crc kubenswrapper[4909]: E1201 10:31:23.338229 4909 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.356708 4909 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.356817 4909 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.357711 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.357738 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.357763 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.357899 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.358123 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.358154 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.358817 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.358841 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.358857 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.358867 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.358866 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.358911 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.359049 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.359099 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.359136 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.359830 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.359858 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.359869 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.360020 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.360100 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.360144 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.360235 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.360255 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.360265 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.360712 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.360749 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.360760 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.360861 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.360947 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.360997 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.361335 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.361364 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.361375 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.361793 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.361823 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.361835 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.361849 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.361891 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.361900 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.362129 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.362176 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.362925 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.362950 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.362958 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:23 crc kubenswrapper[4909]: E1201 10:31:23.372444 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.72:6443: connect: connection refused" interval="400ms" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.393732 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.393774 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.393808 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.393917 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.393952 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.393970 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.393983 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.393998 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.394014 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.394028 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.394041 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.394059 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.394101 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.394127 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.394169 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.425951 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.427298 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.427331 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.427340 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.427383 4909 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 10:31:23 crc kubenswrapper[4909]: E1201 10:31:23.427977 4909 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial 
tcp 38.129.56.72:6443: connect: connection refused" node="crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.494686 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.494753 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.494776 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.494809 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.494825 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.494827 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.494843 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.494860 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.494906 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.494916 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.494921 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.494938 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.494941 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.494928 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.494957 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.494978 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:23 
crc kubenswrapper[4909]: I1201 10:31:23.494997 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.494968 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.495006 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.494954 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.494944 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.494986 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.495066 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.495097 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.495113 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.495139 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.495155 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 
10:31:23.495178 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.495217 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.495311 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.628262 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.629834 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.629964 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.629990 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.630041 4909 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 10:31:23 crc kubenswrapper[4909]: E1201 10:31:23.630670 4909 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.72:6443: connect: connection refused" node="crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.700459 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.721582 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.729445 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-ac7472ffbde30f065504e35d9fc81a05066a2bbbb17a79919a7e4bbb4ef81c44 WatchSource:0}: Error finding container ac7472ffbde30f065504e35d9fc81a05066a2bbbb17a79919a7e4bbb4ef81c44: Status 404 returned error can't find the container with id ac7472ffbde30f065504e35d9fc81a05066a2bbbb17a79919a7e4bbb4ef81c44 Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.736603 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.744920 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-bb91cd71ecc2405067da0ed75aabee011a37d93e7ff835395ad2a586a1ea82fe WatchSource:0}: Error finding container bb91cd71ecc2405067da0ed75aabee011a37d93e7ff835395ad2a586a1ea82fe: Status 404 returned error can't find the container with id bb91cd71ecc2405067da0ed75aabee011a37d93e7ff835395ad2a586a1ea82fe Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.753213 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: I1201 10:31:23.759704 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:23 crc kubenswrapper[4909]: E1201 10:31:23.773452 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.72:6443: connect: connection refused" interval="800ms" Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.773866 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-05b856bde1ab6ae5c4d6e388805751bea8b2699325716106e37d2b3c28bf8c97 WatchSource:0}: Error finding container 05b856bde1ab6ae5c4d6e388805751bea8b2699325716106e37d2b3c28bf8c97: Status 404 returned error can't find the container with id 05b856bde1ab6ae5c4d6e388805751bea8b2699325716106e37d2b3c28bf8c97 Dec 01 10:31:23 crc kubenswrapper[4909]: W1201 10:31:23.787270 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-a39859a79c3c7191fce42a1e81901a1e147aa5c9b56de7cf6c5967c6c51ec751 WatchSource:0}: Error finding container a39859a79c3c7191fce42a1e81901a1e147aa5c9b56de7cf6c5967c6c51ec751: Status 404 returned error can't find the container with id a39859a79c3c7191fce42a1e81901a1e147aa5c9b56de7cf6c5967c6c51ec751 Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.030820 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:24 crc kubenswrapper[4909]: W1201 10:31:24.030928 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.72:6443: connect: connection refused Dec 01 10:31:24 crc kubenswrapper[4909]: E1201 10:31:24.031013 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.72:6443: connect: connection refused" logger="UnhandledError" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.032086 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.032117 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.032130 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.032156 4909 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 10:31:24 crc kubenswrapper[4909]: E1201 10:31:24.032588 4909 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.72:6443: connect: connection refused" node="crc" Dec 01 10:31:24 crc kubenswrapper[4909]: W1201 10:31:24.122671 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.72:6443: connect: connection refused Dec 01 10:31:24 crc kubenswrapper[4909]: E1201 10:31:24.123106 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.72:6443: connect: connection refused" logger="UnhandledError" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.169046 4909 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.72:6443: connect: connection refused Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.172141 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 06:32:00.357549007 +0000 UTC Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.263408 4909 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a" exitCode=0 Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.263518 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a"} Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.263645 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e8e15923fc5a6216711ef5a62de3fdf54af23af822a7b9bafecf2abe31b86160"} Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.263784 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.265667 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc 
kubenswrapper[4909]: I1201 10:31:24.265702 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.265714 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.265989 4909 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="9317725fa67399cc2be4bca84c2bbf5d3dd611420ebc76c9a995a4f2dac6d010" exitCode=0 Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.266043 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"9317725fa67399cc2be4bca84c2bbf5d3dd611420ebc76c9a995a4f2dac6d010"} Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.266106 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"bb91cd71ecc2405067da0ed75aabee011a37d93e7ff835395ad2a586a1ea82fe"} Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.266244 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.268554 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.268595 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.268614 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.272226 4909 generic.go:334] 
"Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="399838e4442f6b2e73986a31a47c30cc1fbbe45693a3f304f6b72f6f210565c4" exitCode=0 Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.272304 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"399838e4442f6b2e73986a31a47c30cc1fbbe45693a3f304f6b72f6f210565c4"} Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.272374 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ac7472ffbde30f065504e35d9fc81a05066a2bbbb17a79919a7e4bbb4ef81c44"} Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.272492 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.273453 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.273486 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.273500 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.274247 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a"} Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.274291 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a39859a79c3c7191fce42a1e81901a1e147aa5c9b56de7cf6c5967c6c51ec751"} Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.275896 4909 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a" exitCode=0 Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.275935 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a"} Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.275954 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"05b856bde1ab6ae5c4d6e388805751bea8b2699325716106e37d2b3c28bf8c97"} Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.276044 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.276916 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.276963 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.276975 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.282728 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 
10:31:24.284054 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.284101 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.284112 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4909]: W1201 10:31:24.539290 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.72:6443: connect: connection refused Dec 01 10:31:24 crc kubenswrapper[4909]: E1201 10:31:24.539436 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.72:6443: connect: connection refused" logger="UnhandledError" Dec 01 10:31:24 crc kubenswrapper[4909]: E1201 10:31:24.575102 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.72:6443: connect: connection refused" interval="1.6s" Dec 01 10:31:24 crc kubenswrapper[4909]: W1201 10:31:24.755022 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.72:6443: connect: connection refused Dec 01 10:31:24 crc kubenswrapper[4909]: E1201 10:31:24.755094 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: 
Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.72:6443: connect: connection refused" logger="UnhandledError" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.833470 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.835419 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.835482 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.835505 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:24 crc kubenswrapper[4909]: I1201 10:31:24.835552 4909 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 10:31:24 crc kubenswrapper[4909]: E1201 10:31:24.838256 4909 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.72:6443: connect: connection refused" node="crc" Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.130662 4909 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 01 10:31:25 crc kubenswrapper[4909]: E1201 10:31:25.132085 4909 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.72:6443: connect: connection refused" logger="UnhandledError" Dec 01 10:31:25 crc kubenswrapper[4909]: 
I1201 10:31:25.172904 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 05:39:43.280610655 +0000 UTC Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.280066 4909 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0" exitCode=0 Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.280140 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0"} Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.280295 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.281197 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.281220 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.281233 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.283412 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"37f14edf70404ba216b5dc2e1aaad7f289144fd1d6361148cd7a93232140469c"} Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.283499 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.284293 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.284315 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.284324 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.286365 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d3ab7863aa3997f849fe685ae36085186463a760ab10187231605eb4a1bc181b"} Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.286404 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7df23df31cee6a5831601de7b58d6e70b19456b9e410df2b2061be651927a1f2"} Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.286420 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5daaa83677d79853eb2fee9d9c23a3b0cdc605ed7cbdc9035398272dec901f33"} Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.286512 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.287228 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.287253 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.287262 
4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.289399 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.289801 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d"} Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.289830 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb"} Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.289849 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc"} Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.290181 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.290204 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.290213 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.292921 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1"} Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.292949 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d"} Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.292961 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a"} Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.292970 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd"} Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.292979 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471"} Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.293053 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.293606 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.293629 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.293640 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:25 crc kubenswrapper[4909]: I1201 10:31:25.485327 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.173351 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 08:18:51.899199784 +0000 UTC Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.173400 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 117h47m25.725805533s for next certificate rotation Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.300194 4909 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53" exitCode=0 Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.300303 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53"} Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.300437 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.300609 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.300788 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.302258 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.302301 
4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.302311 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.303005 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.303051 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.303071 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.303100 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.303144 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.303165 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.438517 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.440155 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.440212 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.440230 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 
01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.440318 4909 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.937805 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.949929 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.950113 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.950169 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.951515 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.951558 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:26 crc kubenswrapper[4909]: I1201 10:31:26.951574 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:27 crc kubenswrapper[4909]: I1201 10:31:27.181591 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:27 crc kubenswrapper[4909]: I1201 10:31:27.308727 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1"} Dec 01 10:31:27 crc kubenswrapper[4909]: I1201 10:31:27.308834 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24"} Dec 01 10:31:27 crc kubenswrapper[4909]: I1201 10:31:27.308783 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:27 crc kubenswrapper[4909]: I1201 10:31:27.308904 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1"} Dec 01 10:31:27 crc kubenswrapper[4909]: I1201 10:31:27.308940 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094"} Dec 01 10:31:27 crc kubenswrapper[4909]: I1201 10:31:27.308950 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:27 crc kubenswrapper[4909]: I1201 10:31:27.310079 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:27 crc kubenswrapper[4909]: I1201 10:31:27.310108 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:27 crc kubenswrapper[4909]: I1201 10:31:27.310118 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:27 crc kubenswrapper[4909]: I1201 10:31:27.310553 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:27 crc kubenswrapper[4909]: I1201 10:31:27.310634 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:27 crc kubenswrapper[4909]: I1201 10:31:27.310667 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:28 crc kubenswrapper[4909]: I1201 10:31:28.317613 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d"} Dec 01 10:31:28 crc kubenswrapper[4909]: I1201 10:31:28.317789 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:28 crc kubenswrapper[4909]: I1201 10:31:28.319103 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:28 crc kubenswrapper[4909]: I1201 10:31:28.319131 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:28 crc kubenswrapper[4909]: I1201 10:31:28.319139 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:28 crc kubenswrapper[4909]: I1201 10:31:28.903839 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:28 crc kubenswrapper[4909]: I1201 10:31:28.904134 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:28 crc kubenswrapper[4909]: I1201 10:31:28.905633 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:28 crc kubenswrapper[4909]: I1201 10:31:28.905690 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:28 crc kubenswrapper[4909]: I1201 10:31:28.905703 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:29 crc kubenswrapper[4909]: I1201 
10:31:29.320276 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:29 crc kubenswrapper[4909]: I1201 10:31:29.321519 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:29 crc kubenswrapper[4909]: I1201 10:31:29.321558 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:29 crc kubenswrapper[4909]: I1201 10:31:29.321572 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:29 crc kubenswrapper[4909]: I1201 10:31:29.400768 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:29 crc kubenswrapper[4909]: I1201 10:31:29.400960 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:29 crc kubenswrapper[4909]: I1201 10:31:29.402018 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:29 crc kubenswrapper[4909]: I1201 10:31:29.402056 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:29 crc kubenswrapper[4909]: I1201 10:31:29.402066 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:29 crc kubenswrapper[4909]: I1201 10:31:29.451557 4909 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 01 10:31:29 crc kubenswrapper[4909]: I1201 10:31:29.820035 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 01 10:31:30 crc kubenswrapper[4909]: I1201 10:31:30.322599 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Dec 01 10:31:30 crc kubenswrapper[4909]: I1201 10:31:30.323930 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:30 crc kubenswrapper[4909]: I1201 10:31:30.323971 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:30 crc kubenswrapper[4909]: I1201 10:31:30.323984 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:32 crc kubenswrapper[4909]: I1201 10:31:32.459518 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 01 10:31:32 crc kubenswrapper[4909]: I1201 10:31:32.459708 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:32 crc kubenswrapper[4909]: I1201 10:31:32.460841 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:32 crc kubenswrapper[4909]: I1201 10:31:32.460885 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:32 crc kubenswrapper[4909]: I1201 10:31:32.460895 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:33 crc kubenswrapper[4909]: E1201 10:31:33.339049 4909 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 01 10:31:34 crc kubenswrapper[4909]: I1201 10:31:34.920660 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:34 crc kubenswrapper[4909]: I1201 10:31:34.921044 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:34 crc kubenswrapper[4909]: I1201 
10:31:34.923364 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:34 crc kubenswrapper[4909]: I1201 10:31:34.923416 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:34 crc kubenswrapper[4909]: I1201 10:31:34.923428 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:35 crc kubenswrapper[4909]: I1201 10:31:35.167530 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:35 crc kubenswrapper[4909]: I1201 10:31:35.170708 4909 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 01 10:31:35 crc kubenswrapper[4909]: I1201 10:31:35.176732 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:35 crc kubenswrapper[4909]: I1201 10:31:35.334407 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:35 crc kubenswrapper[4909]: I1201 10:31:35.335639 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:35 crc kubenswrapper[4909]: I1201 10:31:35.335716 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:35 crc kubenswrapper[4909]: I1201 10:31:35.335730 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:35 crc kubenswrapper[4909]: I1201 10:31:35.340178 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:36 crc kubenswrapper[4909]: E1201 10:31:36.176593 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 01 10:31:36 crc kubenswrapper[4909]: I1201 10:31:36.336957 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:36 crc kubenswrapper[4909]: I1201 10:31:36.337957 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:36 crc kubenswrapper[4909]: I1201 10:31:36.338016 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:36 crc kubenswrapper[4909]: I1201 10:31:36.338033 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:36 crc kubenswrapper[4909]: W1201 10:31:36.411231 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 01 10:31:36 crc kubenswrapper[4909]: I1201 10:31:36.411325 4909 trace.go:236] Trace[1369237628]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 10:31:26.408) (total time: 10002ms): Dec 01 10:31:36 crc kubenswrapper[4909]: Trace[1369237628]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (10:31:36.411) Dec 01 10:31:36 crc kubenswrapper[4909]: Trace[1369237628]: [10.002721304s] [10.002721304s] 
END Dec 01 10:31:36 crc kubenswrapper[4909]: E1201 10:31:36.411352 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 01 10:31:36 crc kubenswrapper[4909]: E1201 10:31:36.442015 4909 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 01 10:31:36 crc kubenswrapper[4909]: I1201 10:31:36.530236 4909 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Dec 01 10:31:36 crc kubenswrapper[4909]: I1201 10:31:36.530460 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 01 10:31:36 crc kubenswrapper[4909]: I1201 10:31:36.535354 4909 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: 
[clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Dec 01 10:31:36 crc kubenswrapper[4909]: I1201 10:31:36.535683 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 01 10:31:36 crc kubenswrapper[4909]: I1201 10:31:36.957034 4909 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]log ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]etcd ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/generic-apiserver-start-informers ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/priority-and-fairness-filter ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/start-apiextensions-informers ok Dec 01 10:31:36 
crc kubenswrapper[4909]: [+]poststarthook/start-apiextensions-controllers ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/crd-informer-synced ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/start-system-namespaces-controller ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 01 10:31:36 crc kubenswrapper[4909]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 01 10:31:36 crc kubenswrapper[4909]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/bootstrap-controller ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/start-kube-aggregator-informers ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/apiservice-registration-controller ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/apiservice-discovery-controller ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 01 10:31:36 crc 
kubenswrapper[4909]: [+]autoregister-completion ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/apiservice-openapi-controller ok Dec 01 10:31:36 crc kubenswrapper[4909]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 01 10:31:36 crc kubenswrapper[4909]: livez check failed Dec 01 10:31:36 crc kubenswrapper[4909]: I1201 10:31:36.957365 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:31:37 crc kubenswrapper[4909]: I1201 10:31:37.339463 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:37 crc kubenswrapper[4909]: I1201 10:31:37.340632 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:37 crc kubenswrapper[4909]: I1201 10:31:37.340671 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:37 crc kubenswrapper[4909]: I1201 10:31:37.340681 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:37 crc kubenswrapper[4909]: I1201 10:31:37.921490 4909 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 10:31:37 crc kubenswrapper[4909]: I1201 10:31:37.921633 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" 
output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 10:31:39 crc kubenswrapper[4909]: I1201 10:31:39.642521 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:39 crc kubenswrapper[4909]: I1201 10:31:39.644793 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:39 crc kubenswrapper[4909]: I1201 10:31:39.645012 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:39 crc kubenswrapper[4909]: I1201 10:31:39.645150 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:39 crc kubenswrapper[4909]: I1201 10:31:39.645309 4909 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 10:31:39 crc kubenswrapper[4909]: E1201 10:31:39.651376 4909 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 01 10:31:40 crc kubenswrapper[4909]: I1201 10:31:40.484349 4909 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.171455 4909 apiserver.go:52] "Watching apiserver" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.182085 4909 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.182494 4909 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.182968 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.183088 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:41 crc kubenswrapper[4909]: E1201 10:31:41.183218 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.183370 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:41 crc kubenswrapper[4909]: E1201 10:31:41.183426 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.183658 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.183692 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.183749 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:41 crc kubenswrapper[4909]: E1201 10:31:41.183806 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.186931 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.186972 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.187047 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.187161 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.187236 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.187415 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.187960 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.188565 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.189657 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.208868 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.223656 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.234638 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.250058 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.263446 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.272593 4909 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.275218 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.287083 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.298809 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.510679 4909 trace.go:236] Trace[1861000660]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 10:31:27.091) (total time: 14419ms): Dec 01 10:31:41 crc kubenswrapper[4909]: Trace[1861000660]: ---"Objects listed" error: 14418ms (10:31:41.510) Dec 01 10:31:41 crc kubenswrapper[4909]: Trace[1861000660]: [14.419060755s] [14.419060755s] END Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.510730 4909 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.512285 4909 trace.go:236] Trace[143751135]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 10:31:26.712) (total time: 14799ms): Dec 01 10:31:41 crc kubenswrapper[4909]: Trace[143751135]: ---"Objects listed" error: 14799ms (10:31:41.512) Dec 01 10:31:41 crc kubenswrapper[4909]: Trace[143751135]: 
[14.799563702s] [14.799563702s] END Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.512327 4909 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.513612 4909 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.514775 4909 trace.go:236] Trace[187928043]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 10:31:27.047) (total time: 14466ms): Dec 01 10:31:41 crc kubenswrapper[4909]: Trace[187928043]: ---"Objects listed" error: 14466ms (10:31:41.514) Dec 01 10:31:41 crc kubenswrapper[4909]: Trace[187928043]: [14.466997566s] [14.466997566s] END Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.514805 4909 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.539605 4909 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.548556 4909 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60802->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.548661 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60802->192.168.126.11:17697: read: connection reset by peer" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.560834 
4909 csr.go:261] certificate signing request csr-nb27c is approved, waiting to be issued Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.569899 4909 csr.go:257] certificate signing request csr-nb27c is issued Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614213 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614255 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614279 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614303 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614320 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 
10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614338 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614356 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614373 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614389 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614407 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614424 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614438 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614459 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614478 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614494 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614510 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614526 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614600 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614625 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614725 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614750 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614731 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614980 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.614955 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615031 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615052 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615057 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615072 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615156 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615305 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615343 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615434 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615449 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615483 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615521 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615548 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615572 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615538 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615620 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615594 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615751 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615806 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615817 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615839 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615841 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615923 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.615983 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616009 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616036 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616055 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616094 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616113 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616137 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616154 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616182 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616187 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616209 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616262 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616263 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616311 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616319 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616337 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616359 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616379 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616396 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616414 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616433 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616453 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616473 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616491 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616530 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616548 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: 
\"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616553 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616566 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616584 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616603 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616630 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 
10:31:41.616650 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616666 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616685 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616707 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616725 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616745 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616762 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616599 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616781 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616715 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616798 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616758 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616815 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616818 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616835 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616853 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616887 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616909 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616927 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616945 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616965 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616985 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617003 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617019 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617038 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617055 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617071 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617086 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617114 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617129 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617145 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617163 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617179 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617212 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617227 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617242 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 
10:31:41.617262 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617280 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617298 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617315 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617332 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617351 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617368 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617388 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617408 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617433 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617450 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617469 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617489 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617508 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617568 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617586 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617605 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617625 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617642 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617687 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617706 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617723 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 01 
10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617743 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617759 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617775 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617793 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617809 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617827 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616862 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616927 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617891 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617849 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617076 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617131 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617941 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617167 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617335 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617967 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617990 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618016 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618040 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618154 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618183 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618208 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618232 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618254 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618278 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618300 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: 
\"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618322 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618346 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618370 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618396 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618423 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618447 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618471 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618495 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618521 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618544 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618568 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 10:31:41 crc 
kubenswrapper[4909]: I1201 10:31:41.618594 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618618 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618638 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618661 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618680 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618908 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618926 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618944 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618963 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618981 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.618997 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619019 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619041 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619063 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619089 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619113 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619139 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: 
\"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619168 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619189 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619212 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619236 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619258 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619282 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619303 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619340 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619367 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619390 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619413 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: 
\"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619440 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619460 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619481 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619499 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619515 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619536 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619554 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619694 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619713 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619732 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619750 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619767 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619787 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619820 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619848 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619896 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619923 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619947 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619969 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619995 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620018 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620050 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: 
\"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620081 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620098 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620117 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620138 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620154 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620236 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620255 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620275 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620295 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620313 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620330 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 10:31:41 crc 
kubenswrapper[4909]: I1201 10:31:41.620349 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620400 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620432 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620462 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620485 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620515 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620541 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620568 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620586 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620606 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620630 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620653 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620676 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620696 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:41 crc kubenswrapper[4909]: 
I1201 10:31:41.620713 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620818 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620836 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620850 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620865 4909 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620901 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620915 4909 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 
01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620930 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620942 4909 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620955 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620969 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620983 4909 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.620997 4909 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621011 4909 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621023 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621037 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621047 4909 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621059 4909 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621069 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621083 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621101 4909 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621116 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621130 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621141 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621150 4909 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621160 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621171 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621186 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621201 4909 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" 
Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621216 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621229 4909 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621239 4909 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621253 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621266 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621279 4909 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621292 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621305 4909 
reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621319 4909 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621333 4909 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617333 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617350 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617435 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617627 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617681 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617833 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.624098 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.616926 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.617682 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619103 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.619187 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621107 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621235 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621320 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621478 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621608 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.621970 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.622043 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.622065 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.622266 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.622396 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.622477 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.622849 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.622829 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.622945 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.624503 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.622966 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.623195 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.623287 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.623652 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.623690 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.623897 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.623960 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.623445 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.625022 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.625413 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.625558 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.625862 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.625995 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.626070 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.626338 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.626470 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.627341 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.627558 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.627789 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.628443 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.628866 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.629096 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.629523 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.629843 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.630047 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.630246 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.630350 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.630591 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.631363 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.631563 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.631124 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.632249 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.632288 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.633921 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.634174 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.634098 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.634501 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.634536 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.634585 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.634517 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.635107 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.636070 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.636712 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.637141 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.637974 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.638300 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.638562 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.634936 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.639179 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.639340 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.639733 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.639806 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.639083 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.645216 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.645523 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.645698 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: E1201 10:31:41.645926 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:31:42.145903167 +0000 UTC m=+19.380374065 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.646475 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.650099 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.650224 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.650306 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.650325 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: E1201 10:31:41.650405 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:41 crc kubenswrapper[4909]: E1201 10:31:41.650500 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:42.150470707 +0000 UTC m=+19.384941605 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.651013 4909 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.651223 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.652289 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.651301 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.651356 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.651656 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: E1201 10:31:41.651677 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:41 crc kubenswrapper[4909]: E1201 10:31:41.652405 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:42.152390865 +0000 UTC m=+19.386861763 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.651706 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.651853 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.652232 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.652525 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.653082 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.653490 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.654217 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.655163 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.656405 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.656490 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.657175 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.656504 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.657494 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.658058 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.660658 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.661661 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.661736 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.662895 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.663246 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.663244 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.663441 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 10:31:41 crc kubenswrapper[4909]: E1201 10:31:41.665201 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:41 crc kubenswrapper[4909]: E1201 10:31:41.665287 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:41 crc kubenswrapper[4909]: E1201 10:31:41.665306 4909 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:41 crc kubenswrapper[4909]: E1201 10:31:41.665376 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:42.165356699 +0000 UTC m=+19.399827597 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.666793 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.667276 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.667508 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.669397 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.670092 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.671788 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 10:31:41 crc kubenswrapper[4909]: E1201 10:31:41.673472 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:41 crc kubenswrapper[4909]: E1201 10:31:41.673515 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:41 crc kubenswrapper[4909]: E1201 10:31:41.673528 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:41 crc kubenswrapper[4909]: E1201 10:31:41.673596 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:42.173559776 +0000 UTC m=+19.408030664 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.676332 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.676349 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.676331 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.676688 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.676711 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.676978 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.677123 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.681896 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.682797 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.683379 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.683765 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.687959 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.687942 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.688244 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.688189 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.688993 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.689337 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.689540 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.689568 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.689741 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.690170 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.690432 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.690807 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.691304 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.691696 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.692522 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.692594 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.692673 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.692778 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.693277 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.693397 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.694465 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.695366 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.695513 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.695730 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.696115 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.696326 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.696414 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.696493 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.701689 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.702415 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.722578 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.722642 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.722725 4909 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.722739 4909 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.722753 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.722768 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.722780 
4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.722790 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.722801 4909 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.722812 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.722823 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.722834 4909 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.722845 4909 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.722856 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.722868 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.722901 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.722911 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.722923 4909 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.722935 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.722948 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.722977 4909 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 
01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723005 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723018 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723029 4909 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723040 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723052 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723062 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723070 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 
10:31:41.723079 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723088 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723112 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723135 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723147 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723167 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723183 4909 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723196 4909 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723205 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723214 4909 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723222 4909 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723231 4909 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723240 4909 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723268 4909 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723277 4909 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723285 4909 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723293 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723322 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723332 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723340 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723349 4909 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723357 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723365 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723375 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723383 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723385 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723403 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723476 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723488 4909 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723500 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723511 4909 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723520 4909 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723529 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723539 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723550 4909 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723561 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723570 4909 reconciler_common.go:293] "Volume detached for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723582 4909 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723591 4909 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723603 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723613 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723623 4909 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723634 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723645 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723655 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723666 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723676 4909 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723685 4909 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723695 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723704 4909 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723714 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723725 4909 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723734 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723753 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723762 4909 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723771 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723781 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723794 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" 
DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723803 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723811 4909 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723821 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723830 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723839 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723849 4909 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723857 4909 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723866 
4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723889 4909 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723899 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723908 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723917 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723926 4909 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723937 4909 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723946 4909 reconciler_common.go:293] "Volume detached for volume 
\"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723954 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723963 4909 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723977 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723987 4909 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723997 4909 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724007 4909 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724017 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on 
node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724028 4909 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724039 4909 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724052 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724064 4909 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724075 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724085 4909 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724097 4909 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724108 4909 reconciler_common.go:293] "Volume detached for volume 
\"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724118 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724129 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724138 4909 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724148 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724159 4909 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724173 4909 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724185 4909 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc 
kubenswrapper[4909]: I1201 10:31:41.724195 4909 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724204 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724213 4909 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724222 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724232 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724242 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724251 4909 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724260 4909 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724269 4909 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724278 4909 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724287 4909 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724299 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724309 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724318 4909 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724326 4909 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724336 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724345 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724354 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724363 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724374 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724383 4909 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724392 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724401 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724411 4909 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724420 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724429 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724440 4909 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724449 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724457 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 
01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724467 4909 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724477 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724485 4909 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.724493 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.723109 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.725755 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.726786 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.728400 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.774094 4909 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.774204 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.797751 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.806484 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.809812 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.825115 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.825143 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.825153 4909 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 01 10:31:41 crc kubenswrapper[4909]: W1201 10:31:41.829175 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-a915717a0a947ef585038f7164aaecd60fbe5fdcc32c070691b99d8adc37a842 WatchSource:0}: Error finding container a915717a0a947ef585038f7164aaecd60fbe5fdcc32c070691b99d8adc37a842: Status 404 returned error can't find the container with id a915717a0a947ef585038f7164aaecd60fbe5fdcc32c070691b99d8adc37a842 Dec 01 10:31:41 crc kubenswrapper[4909]: W1201 10:31:41.830014 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-6aacd101ac59d4645249f53a4e0c42b609b98934e528fe600b9688f23b42968e WatchSource:0}: Error finding container 6aacd101ac59d4645249f53a4e0c42b609b98934e528fe600b9688f23b42968e: Status 404 returned error can't find the container with id 6aacd101ac59d4645249f53a4e0c42b609b98934e528fe600b9688f23b42968e Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.956326 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.956894 4909 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.956967 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.961224 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.971499 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.972144 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:41 crc kubenswrapper[4909]: I1201 10:31:41.996275 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.016025 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.031034 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.050971 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.068982 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.086083 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2c
f247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.099960 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.115355 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.130621 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.143637 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.154533 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.165133 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.230740 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.230861 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:42 crc kubenswrapper[4909]: E1201 10:31:42.230921 4909 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:31:43.230863674 +0000 UTC m=+20.465334582 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.230962 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:42 crc kubenswrapper[4909]: E1201 10:31:42.230995 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.231020 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:42 crc kubenswrapper[4909]: E1201 10:31:42.231062 4909 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:43.23104775 +0000 UTC m=+20.465518648 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.231088 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:42 crc kubenswrapper[4909]: E1201 10:31:42.231109 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:42 crc kubenswrapper[4909]: E1201 10:31:42.231174 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:42 crc kubenswrapper[4909]: E1201 10:31:42.231186 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:43.231173214 +0000 UTC m=+20.465644112 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:42 crc kubenswrapper[4909]: E1201 10:31:42.231209 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:42 crc kubenswrapper[4909]: E1201 10:31:42.231234 4909 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:42 crc kubenswrapper[4909]: E1201 10:31:42.231296 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:43.231279928 +0000 UTC m=+20.465750836 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:42 crc kubenswrapper[4909]: E1201 10:31:42.231562 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:42 crc kubenswrapper[4909]: E1201 10:31:42.231581 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:42 crc kubenswrapper[4909]: E1201 10:31:42.231593 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:42 crc kubenswrapper[4909]: E1201 10:31:42.231655 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:43.231626831 +0000 UTC m=+20.466097729 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.256598 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.256603 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:42 crc kubenswrapper[4909]: E1201 10:31:42.256800 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.256626 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:42 crc kubenswrapper[4909]: E1201 10:31:42.256917 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:42 crc kubenswrapper[4909]: E1201 10:31:42.256970 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.355677 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.357276 4909 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1" exitCode=255 Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.357365 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1"} Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.358306 4909 scope.go:117] "RemoveContainer" containerID="fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.359023 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a"} Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.359080 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080"} Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.359111 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a915717a0a947ef585038f7164aaecd60fbe5fdcc32c070691b99d8adc37a842"} Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.360606 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6aacd101ac59d4645249f53a4e0c42b609b98934e528fe600b9688f23b42968e"} Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.362073 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999"} Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.362102 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"cc46630db1b688d3b7329e563ba020b65c3a699848bca091af8b8c606f5470c8"} Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.385583 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.397567 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.408078 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.419272 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.430741 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.443864 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.458071 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.476468 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.494538 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.494540 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.510971 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.511327 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.523330 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.525913 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.540742 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": 
net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.553490 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.566268 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.571543 4909 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-01 10:26:41 +0000 UTC, rotation deadline is 2026-08-16 12:24:29.916285437 +0000 UTC Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.571598 4909 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: 
Waiting 6193h52m47.344689385s for next certificate rotation Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.584821 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.613755 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.642576 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.670467 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.690236 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-tq5mk"] Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.690621 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-tq5mk" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.692866 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.693111 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.693381 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.697647 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.718576 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.739404 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.754933 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.767987 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.780959 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": 
net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.793943 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.808278 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.822565 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.835903 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrbgg\" (UniqueName: \"kubernetes.io/projected/9b1085bc-c2a2-4155-a342-30a9db598319-kube-api-access-xrbgg\") pod \"node-resolver-tq5mk\" (UID: \"9b1085bc-c2a2-4155-a342-30a9db598319\") " pod="openshift-dns/node-resolver-tq5mk" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.835973 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9b1085bc-c2a2-4155-a342-30a9db598319-hosts-file\") pod \"node-resolver-tq5mk\" (UID: \"9b1085bc-c2a2-4155-a342-30a9db598319\") " pod="openshift-dns/node-resolver-tq5mk" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.844846 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.862533 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.875087 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.898432 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:42Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.936759 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrbgg\" (UniqueName: \"kubernetes.io/projected/9b1085bc-c2a2-4155-a342-30a9db598319-kube-api-access-xrbgg\") pod \"node-resolver-tq5mk\" (UID: \"9b1085bc-c2a2-4155-a342-30a9db598319\") " pod="openshift-dns/node-resolver-tq5mk" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.936819 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9b1085bc-c2a2-4155-a342-30a9db598319-hosts-file\") pod \"node-resolver-tq5mk\" (UID: \"9b1085bc-c2a2-4155-a342-30a9db598319\") " pod="openshift-dns/node-resolver-tq5mk" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.936914 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9b1085bc-c2a2-4155-a342-30a9db598319-hosts-file\") pod \"node-resolver-tq5mk\" (UID: \"9b1085bc-c2a2-4155-a342-30a9db598319\") " pod="openshift-dns/node-resolver-tq5mk" Dec 01 10:31:42 crc kubenswrapper[4909]: I1201 10:31:42.954615 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrbgg\" (UniqueName: 
\"kubernetes.io/projected/9b1085bc-c2a2-4155-a342-30a9db598319-kube-api-access-xrbgg\") pod \"node-resolver-tq5mk\" (UID: \"9b1085bc-c2a2-4155-a342-30a9db598319\") " pod="openshift-dns/node-resolver-tq5mk" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.003128 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tq5mk" Dec 01 10:31:43 crc kubenswrapper[4909]: W1201 10:31:43.014169 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b1085bc_c2a2_4155_a342_30a9db598319.slice/crio-1c2a8cdb6006a17d65feec67ddc8fe7377fd92772d8a90cf241a194dc69da2d7 WatchSource:0}: Error finding container 1c2a8cdb6006a17d65feec67ddc8fe7377fd92772d8a90cf241a194dc69da2d7: Status 404 returned error can't find the container with id 1c2a8cdb6006a17d65feec67ddc8fe7377fd92772d8a90cf241a194dc69da2d7 Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.059941 4909 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 01 10:31:43 crc kubenswrapper[4909]: W1201 10:31:43.060366 4909 reflector.go:484] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": watch of *v1.Secret ended with: very short watch: object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": Unexpected watch close - watch lasted less than a second and no items received Dec 01 10:31:43 crc kubenswrapper[4909]: W1201 10:31:43.060607 4909 reflector.go:484] object-"openshift-dns"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 01 10:31:43 crc kubenswrapper[4909]: W1201 10:31:43.060634 4909 reflector.go:484] object-"openshift-dns"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"openshift-service-ca.crt": Unexpected watch 
close - watch lasted less than a second and no items received Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.085366 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j5rks"] Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.086296 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-hr4n5"] Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.086571 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.086751 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-4pcf2"] Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.087053 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.087158 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.088023 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-2qpdc"] Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.088357 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.090491 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.091085 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.091447 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.091531 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.091558 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.091647 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.091709 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.091746 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.091747 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.091720 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.091845 4909 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.091887 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.092135 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.092693 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.093841 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.093886 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.093943 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.093962 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.094078 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.115076 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.135021 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.146429 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.168968 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.191806 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.205677 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.230752 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.239612 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.239779 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-run-ovn-kubernetes\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: E1201 10:31:43.239841 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:31:45.239804146 +0000 UTC m=+22.474275094 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.239925 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-multus-cni-dir\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.239975 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-run-netns\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.239999 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-run-openvswitch\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.240027 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db1501e3-b64b-4bbf-97ec-85f97fb68afb-cnibin\") pod \"multus-additional-cni-plugins-hr4n5\" (UID: \"db1501e3-b64b-4bbf-97ec-85f97fb68afb\") " 
pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.240051 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-kubelet\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.240093 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.240116 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57aeccf3-ec18-4a73-bd74-9b188de510ad-ovn-node-metrics-cert\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.240140 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/89f06a94-5047-41d9-90a3-8433149d22c4-multus-daemon-config\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.240161 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-host-run-multus-certs\") 
pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.240183 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-etc-openvswitch\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.240215 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-host-run-k8s-cni-cncf-io\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.240238 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-etc-kubernetes\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.240280 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-run-systemd\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.240319 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh2w8\" (UniqueName: \"kubernetes.io/projected/57aeccf3-ec18-4a73-bd74-9b188de510ad-kube-api-access-xh2w8\") pod 
\"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: E1201 10:31:43.240416 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:43 crc kubenswrapper[4909]: E1201 10:31:43.240458 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:43 crc kubenswrapper[4909]: E1201 10:31:43.240476 4909 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.240435 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/672850e4-d044-44cc-b8a2-517dc1a285be-rootfs\") pod \"machine-config-daemon-4pcf2\" (UID: \"672850e4-d044-44cc-b8a2-517dc1a285be\") " pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:31:43 crc kubenswrapper[4909]: E1201 10:31:43.240548 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:45.240536852 +0000 UTC m=+22.475007940 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.240596 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db1501e3-b64b-4bbf-97ec-85f97fb68afb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hr4n5\" (UID: \"db1501e3-b64b-4bbf-97ec-85f97fb68afb\") " pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.240655 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57aeccf3-ec18-4a73-bd74-9b188de510ad-env-overrides\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.240710 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-host-var-lib-cni-bin\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.240738 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-slash\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.240776 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-cni-bin\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.241355 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db1501e3-b64b-4bbf-97ec-85f97fb68afb-os-release\") pod \"multus-additional-cni-plugins-hr4n5\" (UID: \"db1501e3-b64b-4bbf-97ec-85f97fb68afb\") " pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.241437 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-multus-socket-dir-parent\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.241539 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db1501e3-b64b-4bbf-97ec-85f97fb68afb-system-cni-dir\") pod \"multus-additional-cni-plugins-hr4n5\" (UID: \"db1501e3-b64b-4bbf-97ec-85f97fb68afb\") " pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.241832 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/db1501e3-b64b-4bbf-97ec-85f97fb68afb-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-hr4n5\" (UID: \"db1501e3-b64b-4bbf-97ec-85f97fb68afb\") " pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.241897 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sm45\" (UniqueName: \"kubernetes.io/projected/db1501e3-b64b-4bbf-97ec-85f97fb68afb-kube-api-access-6sm45\") pod \"multus-additional-cni-plugins-hr4n5\" (UID: \"db1501e3-b64b-4bbf-97ec-85f97fb68afb\") " pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.241936 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-run-ovn\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.241977 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mwrw\" (UniqueName: \"kubernetes.io/projected/89f06a94-5047-41d9-90a3-8433149d22c4-kube-api-access-6mwrw\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242018 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-systemd-units\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242045 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/57aeccf3-ec18-4a73-bd74-9b188de510ad-ovnkube-script-lib\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242083 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk5nt\" (UniqueName: \"kubernetes.io/projected/672850e4-d044-44cc-b8a2-517dc1a285be-kube-api-access-lk5nt\") pod \"machine-config-daemon-4pcf2\" (UID: \"672850e4-d044-44cc-b8a2-517dc1a285be\") " pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242135 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242171 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-system-cni-dir\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242214 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-hostroot\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242238 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57aeccf3-ec18-4a73-bd74-9b188de510ad-ovnkube-config\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242280 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242319 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-multus-conf-dir\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242357 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-node-log\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242393 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: 
I1201 10:31:43.242427 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/672850e4-d044-44cc-b8a2-517dc1a285be-mcd-auth-proxy-config\") pod \"machine-config-daemon-4pcf2\" (UID: \"672850e4-d044-44cc-b8a2-517dc1a285be\") " pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242469 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db1501e3-b64b-4bbf-97ec-85f97fb68afb-cni-binary-copy\") pod \"multus-additional-cni-plugins-hr4n5\" (UID: \"db1501e3-b64b-4bbf-97ec-85f97fb68afb\") " pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242509 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-var-lib-openvswitch\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242541 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/89f06a94-5047-41d9-90a3-8433149d22c4-cni-binary-copy\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242572 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-host-run-netns\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " 
pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242597 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-log-socket\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242634 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-host-var-lib-cni-multus\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242667 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-host-var-lib-kubelet\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242696 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-cnibin\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242722 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-os-release\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc 
kubenswrapper[4909]: I1201 10:31:43.242756 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-cni-netd\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242797 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.242846 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/672850e4-d044-44cc-b8a2-517dc1a285be-proxy-tls\") pod \"machine-config-daemon-4pcf2\" (UID: \"672850e4-d044-44cc-b8a2-517dc1a285be\") " pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:31:43 crc kubenswrapper[4909]: E1201 10:31:43.243217 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:43 crc kubenswrapper[4909]: E1201 10:31:43.243274 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:45.243264408 +0000 UTC m=+22.477735306 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:43 crc kubenswrapper[4909]: E1201 10:31:43.243392 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:43 crc kubenswrapper[4909]: E1201 10:31:43.243412 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:43 crc kubenswrapper[4909]: E1201 10:31:43.243431 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:43 crc kubenswrapper[4909]: E1201 10:31:43.243470 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:45.243456293 +0000 UTC m=+22.477927191 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:43 crc kubenswrapper[4909]: E1201 10:31:43.243665 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:43 crc kubenswrapper[4909]: E1201 10:31:43.243759 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:45.243735463 +0000 UTC m=+22.478206361 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.251918 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.263518 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.264234 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.266096 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.266975 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.268390 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.268726 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.269778 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.272069 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.273716 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.274518 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.275658 
4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.276334 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.277981 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.281834 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.282919 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.283598 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.285291 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.286391 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.287511 
4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.288459 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.289233 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.290373 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.291160 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.291209 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.292301 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.293213 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.293733 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.295626 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.296508 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.298721 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.299667 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.300827 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.301470 4909 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath 
from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.301603 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.304522 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.305289 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.305906 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.308802 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.310573 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.311340 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.316571 4909 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.317711 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.318831 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.319602 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.321131 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.322303 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.322475 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.323131 4909 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.324311 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.325066 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.326703 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.327429 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.328070 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.329862 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.331980 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.332741 4909 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.334174 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345261 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57aeccf3-ec18-4a73-bd74-9b188de510ad-env-overrides\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345308 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db1501e3-b64b-4bbf-97ec-85f97fb68afb-os-release\") pod \"multus-additional-cni-plugins-hr4n5\" (UID: \"db1501e3-b64b-4bbf-97ec-85f97fb68afb\") " pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345327 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-multus-socket-dir-parent\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345346 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-host-var-lib-cni-bin\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc 
kubenswrapper[4909]: I1201 10:31:43.345365 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-slash\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345383 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-cni-bin\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345400 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/db1501e3-b64b-4bbf-97ec-85f97fb68afb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hr4n5\" (UID: \"db1501e3-b64b-4bbf-97ec-85f97fb68afb\") " pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345424 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db1501e3-b64b-4bbf-97ec-85f97fb68afb-system-cni-dir\") pod \"multus-additional-cni-plugins-hr4n5\" (UID: \"db1501e3-b64b-4bbf-97ec-85f97fb68afb\") " pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345442 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-run-ovn\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 
10:31:43.345457 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sm45\" (UniqueName: \"kubernetes.io/projected/db1501e3-b64b-4bbf-97ec-85f97fb68afb-kube-api-access-6sm45\") pod \"multus-additional-cni-plugins-hr4n5\" (UID: \"db1501e3-b64b-4bbf-97ec-85f97fb68afb\") " pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345472 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-system-cni-dir\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345489 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-hostroot\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345502 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mwrw\" (UniqueName: \"kubernetes.io/projected/89f06a94-5047-41d9-90a3-8433149d22c4-kube-api-access-6mwrw\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345518 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-systemd-units\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345532 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/57aeccf3-ec18-4a73-bd74-9b188de510ad-ovnkube-script-lib\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345548 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk5nt\" (UniqueName: \"kubernetes.io/projected/672850e4-d044-44cc-b8a2-517dc1a285be-kube-api-access-lk5nt\") pod \"machine-config-daemon-4pcf2\" (UID: \"672850e4-d044-44cc-b8a2-517dc1a285be\") " pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345580 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-multus-conf-dir\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345593 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57aeccf3-ec18-4a73-bd74-9b188de510ad-ovnkube-config\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345608 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db1501e3-b64b-4bbf-97ec-85f97fb68afb-cni-binary-copy\") pod \"multus-additional-cni-plugins-hr4n5\" (UID: \"db1501e3-b64b-4bbf-97ec-85f97fb68afb\") " pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345622 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-var-lib-openvswitch\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345638 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-node-log\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345681 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345700 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/672850e4-d044-44cc-b8a2-517dc1a285be-mcd-auth-proxy-config\") pod \"machine-config-daemon-4pcf2\" (UID: \"672850e4-d044-44cc-b8a2-517dc1a285be\") " pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345722 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-log-socket\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345745 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/89f06a94-5047-41d9-90a3-8433149d22c4-cni-binary-copy\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345760 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-host-run-netns\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345776 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-host-var-lib-cni-multus\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345790 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-host-var-lib-kubelet\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345805 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-cnibin\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346113 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57aeccf3-ec18-4a73-bd74-9b188de510ad-env-overrides\") pod \"ovnkube-node-j5rks\" (UID: 
\"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346206 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.345819 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-os-release\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346378 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-cni-netd\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346408 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/672850e4-d044-44cc-b8a2-517dc1a285be-proxy-tls\") pod \"machine-config-daemon-4pcf2\" (UID: \"672850e4-d044-44cc-b8a2-517dc1a285be\") " pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346427 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-run-netns\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346442 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-run-openvswitch\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346457 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-run-ovn-kubernetes\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346475 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-multus-cni-dir\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346496 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-kubelet\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346513 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db1501e3-b64b-4bbf-97ec-85f97fb68afb-cnibin\") pod \"multus-additional-cni-plugins-hr4n5\" (UID: \"db1501e3-b64b-4bbf-97ec-85f97fb68afb\") " pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc 
kubenswrapper[4909]: I1201 10:31:43.346535 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57aeccf3-ec18-4a73-bd74-9b188de510ad-ovn-node-metrics-cert\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346551 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/89f06a94-5047-41d9-90a3-8433149d22c4-multus-daemon-config\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346568 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-host-run-multus-certs\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346581 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-multus-conf-dir\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346589 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-etc-openvswitch\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346643 4909 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-etc-openvswitch\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346652 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-etc-kubernetes\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346682 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-run-systemd\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346709 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-host-run-k8s-cni-cncf-io\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346738 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db1501e3-b64b-4bbf-97ec-85f97fb68afb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hr4n5\" (UID: \"db1501e3-b64b-4bbf-97ec-85f97fb68afb\") " pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346762 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh2w8\" (UniqueName: 
\"kubernetes.io/projected/57aeccf3-ec18-4a73-bd74-9b188de510ad-kube-api-access-xh2w8\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346785 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/672850e4-d044-44cc-b8a2-517dc1a285be-rootfs\") pod \"machine-config-daemon-4pcf2\" (UID: \"672850e4-d044-44cc-b8a2-517dc1a285be\") " pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346905 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/672850e4-d044-44cc-b8a2-517dc1a285be-rootfs\") pod \"machine-config-daemon-4pcf2\" (UID: \"672850e4-d044-44cc-b8a2-517dc1a285be\") " pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.346945 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/57aeccf3-ec18-4a73-bd74-9b188de510ad-ovnkube-script-lib\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.347038 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db1501e3-b64b-4bbf-97ec-85f97fb68afb-os-release\") pod \"multus-additional-cni-plugins-hr4n5\" (UID: \"db1501e3-b64b-4bbf-97ec-85f97fb68afb\") " pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.347219 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-multus-socket-dir-parent\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.347243 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-host-var-lib-cni-bin\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.347270 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-slash\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.347292 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-cni-bin\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.347341 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/672850e4-d044-44cc-b8a2-517dc1a285be-mcd-auth-proxy-config\") pod \"machine-config-daemon-4pcf2\" (UID: \"672850e4-d044-44cc-b8a2-517dc1a285be\") " pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.347385 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-log-socket\") pod \"ovnkube-node-j5rks\" (UID: 
\"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.347509 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57aeccf3-ec18-4a73-bd74-9b188de510ad-ovnkube-config\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.347862 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/db1501e3-b64b-4bbf-97ec-85f97fb68afb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hr4n5\" (UID: \"db1501e3-b64b-4bbf-97ec-85f97fb68afb\") " pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.347922 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db1501e3-b64b-4bbf-97ec-85f97fb68afb-system-cni-dir\") pod \"multus-additional-cni-plugins-hr4n5\" (UID: \"db1501e3-b64b-4bbf-97ec-85f97fb68afb\") " pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.347959 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-run-ovn\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.348142 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db1501e3-b64b-4bbf-97ec-85f97fb68afb-cni-binary-copy\") pod \"multus-additional-cni-plugins-hr4n5\" (UID: \"db1501e3-b64b-4bbf-97ec-85f97fb68afb\") " 
pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.348197 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-var-lib-openvswitch\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.348235 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-node-log\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.348266 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-etc-kubernetes\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.348269 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-system-cni-dir\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.348299 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-run-systemd\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.348305 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-hostroot\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.348360 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-host-run-k8s-cni-cncf-io\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.348428 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-systemd-units\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.348587 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-run-netns\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.348615 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-host-run-netns\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.348639 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-host-var-lib-cni-multus\") 
pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.348661 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-host-var-lib-kubelet\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.348692 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-cnibin\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.348727 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-os-release\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.348752 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-cni-netd\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.348759 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db1501e3-b64b-4bbf-97ec-85f97fb68afb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hr4n5\" (UID: \"db1501e3-b64b-4bbf-97ec-85f97fb68afb\") " pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 
10:31:43.348813 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-kubelet\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.348926 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-run-openvswitch\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.348950 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-run-ovn-kubernetes\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.348996 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-multus-cni-dir\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.349610 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/89f06a94-5047-41d9-90a3-8433149d22c4-multus-daemon-config\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.349665 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/db1501e3-b64b-4bbf-97ec-85f97fb68afb-cnibin\") pod \"multus-additional-cni-plugins-hr4n5\" (UID: \"db1501e3-b64b-4bbf-97ec-85f97fb68afb\") " pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.349703 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/89f06a94-5047-41d9-90a3-8433149d22c4-host-run-multus-certs\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.350462 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/89f06a94-5047-41d9-90a3-8433149d22c4-cni-binary-copy\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.357201 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/672850e4-d044-44cc-b8a2-517dc1a285be-proxy-tls\") pod \"machine-config-daemon-4pcf2\" (UID: \"672850e4-d044-44cc-b8a2-517dc1a285be\") " pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.359511 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.361460 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57aeccf3-ec18-4a73-bd74-9b188de510ad-ovn-node-metrics-cert\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.377284 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mwrw\" (UniqueName: \"kubernetes.io/projected/89f06a94-5047-41d9-90a3-8433149d22c4-kube-api-access-6mwrw\") pod \"multus-2qpdc\" (UID: \"89f06a94-5047-41d9-90a3-8433149d22c4\") " pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.380777 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tq5mk" 
event={"ID":"9b1085bc-c2a2-4155-a342-30a9db598319","Type":"ContainerStarted","Data":"1c2a8cdb6006a17d65feec67ddc8fe7377fd92772d8a90cf241a194dc69da2d7"} Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.381497 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sm45\" (UniqueName: \"kubernetes.io/projected/db1501e3-b64b-4bbf-97ec-85f97fb68afb-kube-api-access-6sm45\") pod \"multus-additional-cni-plugins-hr4n5\" (UID: \"db1501e3-b64b-4bbf-97ec-85f97fb68afb\") " pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.382937 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk5nt\" (UniqueName: \"kubernetes.io/projected/672850e4-d044-44cc-b8a2-517dc1a285be-kube-api-access-lk5nt\") pod \"machine-config-daemon-4pcf2\" (UID: \"672850e4-d044-44cc-b8a2-517dc1a285be\") " pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.382937 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh2w8\" (UniqueName: \"kubernetes.io/projected/57aeccf3-ec18-4a73-bd74-9b188de510ad-kube-api-access-xh2w8\") pod \"ovnkube-node-j5rks\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.383392 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.387682 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.390730 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45"} Dec 01 10:31:43 crc 
kubenswrapper[4909]: I1201 10:31:43.390888 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.402655 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.411221 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" Dec 01 10:31:43 crc kubenswrapper[4909]: E1201 10:31:43.421403 4909 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.422220 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.422514 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.430231 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-2qpdc" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.474125 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.505924 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 
10:31:43.564303 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.583532 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.606269 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c
5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.620089 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.639038 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.655526 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.672545 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.687785 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.698587 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.714105 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.733217 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.747146 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.767474 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.781258 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.795735 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.812584 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.828867 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.844137 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.857221 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:43 crc kubenswrapper[4909]: I1201 10:31:43.869376 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.244026 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.256529 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.256588 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.256690 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:44 crc kubenswrapper[4909]: E1201 10:31:44.256815 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:44 crc kubenswrapper[4909]: E1201 10:31:44.256979 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:44 crc kubenswrapper[4909]: E1201 10:31:44.257145 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.270338 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.343819 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.395596 4909 generic.go:334] "Generic (PLEG): container finished" podID="db1501e3-b64b-4bbf-97ec-85f97fb68afb" containerID="1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b" exitCode=0 Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.395693 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" 
event={"ID":"db1501e3-b64b-4bbf-97ec-85f97fb68afb","Type":"ContainerDied","Data":"1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b"} Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.395963 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" event={"ID":"db1501e3-b64b-4bbf-97ec-85f97fb68afb","Type":"ContainerStarted","Data":"df4e02231b26a3bc708c592500a2df0a285981c7626b7a311867b374c5e524cc"} Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.398284 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da"} Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.398316 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d"} Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.398328 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"98e8629d47890f19dba2423eddbfae9d69c2dfa1a006930f4cc6889b2ead5f9a"} Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.399691 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2qpdc" event={"ID":"89f06a94-5047-41d9-90a3-8433149d22c4","Type":"ContainerStarted","Data":"74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784"} Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.399758 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2qpdc" 
event={"ID":"89f06a94-5047-41d9-90a3-8433149d22c4","Type":"ContainerStarted","Data":"fb1ebfbdccf22b611c7410eb9390f470acd72560bc894fff2286a7b97e29450b"} Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.401390 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32"} Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.402736 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tq5mk" event={"ID":"9b1085bc-c2a2-4155-a342-30a9db598319","Type":"ContainerStarted","Data":"edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52"} Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.404396 4909 generic.go:334] "Generic (PLEG): container finished" podID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerID="9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4" exitCode=0 Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.405007 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerDied","Data":"9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4"} Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.405034 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerStarted","Data":"7eb314174062af013b837397c3d50d0a81173d8221ce9d2f03c041dc9b1c86c9"} Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.433217 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.492374 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.518165 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.543533 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lo
g/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID
\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646f
b68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.561163 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.583173 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.598842 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.612378 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.626689 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.640592 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.650924 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.666572 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.684186 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.699109 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 
10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.712942 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.728306 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.741193 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.757119 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.771748 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.789331 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.807725 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.832467 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.851504 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.865903 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.886551 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.899555 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.924843 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.928346 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.934669 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.942223 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:44 crc kubenswrapper[4909]: I1201 10:31:44.968713 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.014143 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.046903 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.100757 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.129618 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.171445 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 
10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.208632 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.248700 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.272987 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.273157 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.273192 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.273215 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.273243 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:45 crc kubenswrapper[4909]: E1201 10:31:45.273365 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:45 crc kubenswrapper[4909]: E1201 10:31:45.273427 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:49.273410001 +0000 UTC m=+26.507880899 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:45 crc kubenswrapper[4909]: E1201 10:31:45.273842 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:31:49.273831176 +0000 UTC m=+26.508302074 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:31:45 crc kubenswrapper[4909]: E1201 10:31:45.273861 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:45 crc kubenswrapper[4909]: E1201 10:31:45.273932 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:45 crc kubenswrapper[4909]: E1201 10:31:45.273949 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:45 crc kubenswrapper[4909]: E1201 10:31:45.273961 4909 
projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:45 crc kubenswrapper[4909]: E1201 10:31:45.273965 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:45 crc kubenswrapper[4909]: E1201 10:31:45.273991 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:49.273963921 +0000 UTC m=+26.508434839 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:45 crc kubenswrapper[4909]: E1201 10:31:45.274004 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:45 crc kubenswrapper[4909]: E1201 10:31:45.274021 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:49.274008752 +0000 UTC m=+26.508479660 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:45 crc kubenswrapper[4909]: E1201 10:31:45.274022 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:45 crc kubenswrapper[4909]: E1201 10:31:45.274085 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:49.274076455 +0000 UTC m=+26.508547363 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.289054 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.326258 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.370746 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.411665 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.416395 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerStarted","Data":"23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a"} Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 
10:31:45.417436 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerStarted","Data":"c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f"} Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.417643 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerStarted","Data":"60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4"} Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.417729 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerStarted","Data":"1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1"} Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.417751 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerStarted","Data":"e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1"} Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.417764 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerStarted","Data":"68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209"} Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.421754 4909 generic.go:334] "Generic (PLEG): container finished" podID="db1501e3-b64b-4bbf-97ec-85f97fb68afb" containerID="7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03" exitCode=0 Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.421779 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" 
event={"ID":"db1501e3-b64b-4bbf-97ec-85f97fb68afb","Type":"ContainerDied","Data":"7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03"} Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.456859 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269
019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a56
46fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCod
e\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.487478 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.531254 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.568092 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.611565 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.646639 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.690849 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.728036 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.769583 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.808310 4909 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.847297 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.887540 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.942919 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:45 crc kubenswrapper[4909]: I1201 10:31:45.979529 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.010175 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.047569 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.051671 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.054312 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.054351 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.054364 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.054487 4909 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.074304 4909 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/node-ca-qggws"] Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.074820 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qggws" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.107889 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.119599 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.138803 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.159241 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.178677 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.181240 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6b27824f-0660-47f4-b7d7-dbe4b908854c-serviceca\") pod \"node-ca-qggws\" (UID: \"6b27824f-0660-47f4-b7d7-dbe4b908854c\") " 
pod="openshift-image-registry/node-ca-qggws" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.181289 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6b27824f-0660-47f4-b7d7-dbe4b908854c-host\") pod \"node-ca-qggws\" (UID: \"6b27824f-0660-47f4-b7d7-dbe4b908854c\") " pod="openshift-image-registry/node-ca-qggws" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.181330 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq9t8\" (UniqueName: \"kubernetes.io/projected/6b27824f-0660-47f4-b7d7-dbe4b908854c-kube-api-access-hq9t8\") pod \"node-ca-qggws\" (UID: \"6b27824f-0660-47f4-b7d7-dbe4b908854c\") " pod="openshift-image-registry/node-ca-qggws" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.199198 4909 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.199657 4909 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.201133 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.201250 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.201335 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.201445 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.201556 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4909]: E1201 10:31:46.220450 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.223925 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.224075 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.224162 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.224257 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.224358 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4909]: E1201 10:31:46.236598 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.240835 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.240867 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.240892 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.240911 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.240923 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.247572 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: E1201 10:31:46.253662 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.256237 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.256264 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.256262 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:46 crc kubenswrapper[4909]: E1201 10:31:46.256416 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:46 crc kubenswrapper[4909]: E1201 10:31:46.256562 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:46 crc kubenswrapper[4909]: E1201 10:31:46.256650 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.258388 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.258435 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.258452 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.258476 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.258492 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4909]: E1201 10:31:46.270646 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.275306 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.275361 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.275373 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.275390 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.275400 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.282445 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq9t8\" (UniqueName: \"kubernetes.io/projected/6b27824f-0660-47f4-b7d7-dbe4b908854c-kube-api-access-hq9t8\") pod \"node-ca-qggws\" (UID: \"6b27824f-0660-47f4-b7d7-dbe4b908854c\") " pod="openshift-image-registry/node-ca-qggws" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.282733 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6b27824f-0660-47f4-b7d7-dbe4b908854c-serviceca\") pod \"node-ca-qggws\" (UID: \"6b27824f-0660-47f4-b7d7-dbe4b908854c\") " pod="openshift-image-registry/node-ca-qggws" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.282818 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6b27824f-0660-47f4-b7d7-dbe4b908854c-host\") pod \"node-ca-qggws\" (UID: \"6b27824f-0660-47f4-b7d7-dbe4b908854c\") " pod="openshift-image-registry/node-ca-qggws" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.282981 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6b27824f-0660-47f4-b7d7-dbe4b908854c-host\") pod \"node-ca-qggws\" (UID: \"6b27824f-0660-47f4-b7d7-dbe4b908854c\") " pod="openshift-image-registry/node-ca-qggws" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.283759 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6b27824f-0660-47f4-b7d7-dbe4b908854c-serviceca\") pod \"node-ca-qggws\" (UID: \"6b27824f-0660-47f4-b7d7-dbe4b908854c\") " pod="openshift-image-registry/node-ca-qggws" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.287984 4909 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: E1201 10:31:46.287857 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: E1201 10:31:46.288150 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.290261 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.290305 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.290320 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.290342 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.290358 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.319500 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq9t8\" (UniqueName: \"kubernetes.io/projected/6b27824f-0660-47f4-b7d7-dbe4b908854c-kube-api-access-hq9t8\") pod \"node-ca-qggws\" (UID: \"6b27824f-0660-47f4-b7d7-dbe4b908854c\") " pod="openshift-image-registry/node-ca-qggws" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.352206 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.386245 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.390490 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qggws" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.393511 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.393553 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.393597 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.393618 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.393631 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.428252 4909 generic.go:334] "Generic (PLEG): container finished" podID="db1501e3-b64b-4bbf-97ec-85f97fb68afb" containerID="03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8" exitCode=0 Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.428303 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" event={"ID":"db1501e3-b64b-4bbf-97ec-85f97fb68afb","Type":"ContainerDied","Data":"03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8"} Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.429463 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qggws" event={"ID":"6b27824f-0660-47f4-b7d7-dbe4b908854c","Type":"ContainerStarted","Data":"d551efda5ab1da62ea07f4b645864abc265c9f4b2d8534f462ef99c58add03b1"} Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.436370 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.468182 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:
24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.498916 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.499333 
4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.499536 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.499565 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.499581 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.510356 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.551051 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.589043 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 
10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.602154 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.602209 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.602219 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.602237 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.602247 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.628363 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.668012 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.704773 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.704824 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.704834 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.704852 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.704864 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.708765 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.747806 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.802200 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.807463 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.807489 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.807498 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.807513 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.807522 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.831294 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.878862 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.911593 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.911643 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.911657 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.911674 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.911684 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:46Z","lastTransitionTime":"2025-12-01T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.918362 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90a
e01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.953081 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:46 crc kubenswrapper[4909]: I1201 10:31:46.994738 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.014824 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.014890 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.014902 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.014920 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.014934 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:47Z","lastTransitionTime":"2025-12-01T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.033008 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.073516 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.114041 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.118740 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:47 crc 
kubenswrapper[4909]: I1201 10:31:47.118827 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.118846 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.118915 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.118944 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:47Z","lastTransitionTime":"2025-12-01T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.150270 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.191401 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.221698 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.221765 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.221782 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.221809 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.221831 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:47Z","lastTransitionTime":"2025-12-01T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.233293 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.269931 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.329783 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.329834 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.329847 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.329866 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.329910 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:47Z","lastTransitionTime":"2025-12-01T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.432667 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.432713 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.432725 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.432742 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.432756 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:47Z","lastTransitionTime":"2025-12-01T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.435358 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qggws" event={"ID":"6b27824f-0660-47f4-b7d7-dbe4b908854c","Type":"ContainerStarted","Data":"9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd"} Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.441304 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerStarted","Data":"b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314"} Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.444545 4909 generic.go:334] "Generic (PLEG): container finished" podID="db1501e3-b64b-4bbf-97ec-85f97fb68afb" containerID="9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf" exitCode=0 Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.444599 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" event={"ID":"db1501e3-b64b-4bbf-97ec-85f97fb68afb","Type":"ContainerDied","Data":"9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf"} Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.455521 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.472482 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.487708 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.501005 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.524296 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.537157 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.537212 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.537226 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.537248 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.537224 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.537262 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:47Z","lastTransitionTime":"2025-12-01T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.553999 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.589742 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 
10:31:47.627573 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.640828 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.640890 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.640906 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.640924 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.640937 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:47Z","lastTransitionTime":"2025-12-01T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.682291 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.711557 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.744339 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.744381 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.744389 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.744404 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.744416 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:47Z","lastTransitionTime":"2025-12-01T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.750079 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.789091 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.838253 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.847835 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.847867 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.847896 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.847915 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.847929 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:47Z","lastTransitionTime":"2025-12-01T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.868328 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.913068 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.947995 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.950154 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.950192 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.950205 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.950229 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.950242 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:47Z","lastTransitionTime":"2025-12-01T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:47 crc kubenswrapper[4909]: I1201 10:31:47.993847 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.028440 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b36
09b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10
:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.052866 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.052917 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.052926 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.052940 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.052949 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:48Z","lastTransitionTime":"2025-12-01T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.077829 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.117466 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.151497 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.160759 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:48 crc 
kubenswrapper[4909]: I1201 10:31:48.160793 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.160802 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.160816 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.161719 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:48Z","lastTransitionTime":"2025-12-01T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.186367 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.232023 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.256709 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.256903 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:48 crc kubenswrapper[4909]: E1201 10:31:48.257087 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.257170 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:48 crc kubenswrapper[4909]: E1201 10:31:48.257488 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:48 crc kubenswrapper[4909]: E1201 10:31:48.257558 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.267138 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.267179 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.267196 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.267216 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.267230 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:48Z","lastTransitionTime":"2025-12-01T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.270838 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.313548 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.354927 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.372086 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.372148 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.372161 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.372185 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.372200 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:48Z","lastTransitionTime":"2025-12-01T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.391931 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.431037 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.452173 4909 generic.go:334] "Generic (PLEG): container finished" podID="db1501e3-b64b-4bbf-97ec-85f97fb68afb" containerID="31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e" exitCode=0 Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.452255 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" 
event={"ID":"db1501e3-b64b-4bbf-97ec-85f97fb68afb","Type":"ContainerDied","Data":"31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e"} Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.473628 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.477338 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.477398 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.477417 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.477445 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.477464 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:48Z","lastTransitionTime":"2025-12-01T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.510803 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.552212 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.580370 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.580416 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.580428 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.580450 4909 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.580464 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:48Z","lastTransitionTime":"2025-12-01T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.596822 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.628694 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.673793 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.683677 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.683718 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.683729 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.683745 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.683757 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:48Z","lastTransitionTime":"2025-12-01T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.708253 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.754585 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.786716 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.786771 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.786787 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.786810 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.786823 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:48Z","lastTransitionTime":"2025-12-01T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.797968 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.830553 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 
10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.870837 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.888709 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.888762 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.888774 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.888816 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.888832 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:48Z","lastTransitionTime":"2025-12-01T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.910817 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z 
is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.948496 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.992291 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.992355 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.992369 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.992405 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.992426 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:48Z","lastTransitionTime":"2025-12-01T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:48 crc kubenswrapper[4909]: I1201 10:31:48.993798 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:48Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.028833 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.072496 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.096014 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.096059 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.096072 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.096092 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.096109 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:49Z","lastTransitionTime":"2025-12-01T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.199618 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.199673 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.199687 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.199708 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.199725 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:49Z","lastTransitionTime":"2025-12-01T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.302592 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.302652 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.302665 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.302686 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.302705 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:49Z","lastTransitionTime":"2025-12-01T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.319245 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:49 crc kubenswrapper[4909]: E1201 10:31:49.319420 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 10:31:57.319391249 +0000 UTC m=+34.553862327 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.319482 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.319546 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.319604 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.319646 4909 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:49 crc kubenswrapper[4909]: E1201 10:31:49.319665 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:49 crc kubenswrapper[4909]: E1201 10:31:49.319738 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:57.31971667 +0000 UTC m=+34.554187578 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:49 crc kubenswrapper[4909]: E1201 10:31:49.319804 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:49 crc kubenswrapper[4909]: E1201 10:31:49.319833 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:49 crc kubenswrapper[4909]: E1201 10:31:49.319856 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:49 crc kubenswrapper[4909]: E1201 10:31:49.319857 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:49 crc kubenswrapper[4909]: E1201 10:31:49.319894 4909 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:49 crc kubenswrapper[4909]: E1201 10:31:49.319903 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:49 crc kubenswrapper[4909]: E1201 10:31:49.319921 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:49 crc kubenswrapper[4909]: E1201 10:31:49.319945 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:57.319921868 +0000 UTC m=+34.554392766 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:49 crc kubenswrapper[4909]: E1201 10:31:49.319974 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:57.319953929 +0000 UTC m=+34.554424847 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:49 crc kubenswrapper[4909]: E1201 10:31:49.319998 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 10:31:57.31998764 +0000 UTC m=+34.554458558 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.412013 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.412096 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.412115 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.412144 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.412164 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:49Z","lastTransitionTime":"2025-12-01T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.458099 4909 generic.go:334] "Generic (PLEG): container finished" podID="db1501e3-b64b-4bbf-97ec-85f97fb68afb" containerID="cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5" exitCode=0 Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.458372 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" event={"ID":"db1501e3-b64b-4bbf-97ec-85f97fb68afb","Type":"ContainerDied","Data":"cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5"} Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.484469 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.500274 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.515067 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.515145 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.515169 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.515206 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.515229 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:49Z","lastTransitionTime":"2025-12-01T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.515325 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.529518 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.550105 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.562393 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.577483 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.592492 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.607536 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.618775 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:49 crc 
kubenswrapper[4909]: I1201 10:31:49.618811 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.618825 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.618847 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.618861 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:49Z","lastTransitionTime":"2025-12-01T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.621015 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.635695 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.649724 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.659811 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.674261 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.692905 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348c
d166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:49Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.721863 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.721934 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.721942 4909 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.721956 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.721968 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:49Z","lastTransitionTime":"2025-12-01T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.824570 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.825276 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.825299 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.825336 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.825364 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:49Z","lastTransitionTime":"2025-12-01T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.927782 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.927823 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.927832 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.927849 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:49 crc kubenswrapper[4909]: I1201 10:31:49.927859 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:49Z","lastTransitionTime":"2025-12-01T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.030701 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.030783 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.030806 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.030834 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.030853 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:50Z","lastTransitionTime":"2025-12-01T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.133020 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.133069 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.133080 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.133102 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.133116 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:50Z","lastTransitionTime":"2025-12-01T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.235739 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.235808 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.235832 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.235859 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.235921 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:50Z","lastTransitionTime":"2025-12-01T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.256471 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.256472 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.256675 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:50 crc kubenswrapper[4909]: E1201 10:31:50.256771 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:50 crc kubenswrapper[4909]: E1201 10:31:50.257033 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:50 crc kubenswrapper[4909]: E1201 10:31:50.257131 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.339309 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.339386 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.339407 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.339434 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.339457 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:50Z","lastTransitionTime":"2025-12-01T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.442988 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.443063 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.443085 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.443116 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.443138 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:50Z","lastTransitionTime":"2025-12-01T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.468116 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerStarted","Data":"bd92e85c3e56138789470a95773fbe3854ad50ab7b850542e6f781efb70156ee"} Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.468590 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.468674 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.474955 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" event={"ID":"db1501e3-b64b-4bbf-97ec-85f97fb68afb","Type":"ContainerStarted","Data":"ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628"} Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.489173 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.527023 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.527102 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.527350 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.545602 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.545634 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.545663 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.545680 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.545690 4909 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:50Z","lastTransitionTime":"2025-12-01T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.549551 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d8
48f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6
7314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.565096 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.578244 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.592337 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.615340 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd92e85c3e56138789470a95773fbe3854ad50ab7b850542e6f781efb70156ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.628489 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.645674 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.648228 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.648263 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.648272 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.648290 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.648303 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:50Z","lastTransitionTime":"2025-12-01T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.667450 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.691216 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.712444 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.733166 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.751835 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.751946 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.751968 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.751997 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.752020 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:50Z","lastTransitionTime":"2025-12-01T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.753666 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.770014 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.790335 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.807707 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.823643 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.835960 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.850756 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.855315 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.855374 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.855392 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.855416 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.855430 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:50Z","lastTransitionTime":"2025-12-01T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.868400 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.882244 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.901833 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.927002 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d4
5f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.945092 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:50 crc kubenswrapper[4909]: 
I1201 10:31:50.957748 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.957796 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.957809 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.957827 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.957840 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:50Z","lastTransitionTime":"2025-12-01T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:50 crc kubenswrapper[4909]: I1201 10:31:50.979568 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:50Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.003667 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:51Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.017927 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:51Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.033065 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:51Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.058485 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd92e85c3e56138789470a95773fbe3854ad50ab7b850542e6f781efb70156ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:51Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.060022 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.060096 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.060115 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.060145 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.060166 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:51Z","lastTransitionTime":"2025-12-01T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.163984 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.164061 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.164081 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.164109 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.164131 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:51Z","lastTransitionTime":"2025-12-01T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.266703 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.266765 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.266785 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.266801 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.266815 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:51Z","lastTransitionTime":"2025-12-01T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.370894 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.370959 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.370975 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.371002 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.371019 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:51Z","lastTransitionTime":"2025-12-01T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.474123 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.474188 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.474204 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.474228 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.474243 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:51Z","lastTransitionTime":"2025-12-01T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.481914 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.623515 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.623571 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.623584 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.623605 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.623619 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:51Z","lastTransitionTime":"2025-12-01T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.726822 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.726863 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.726888 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.726906 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.726916 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:51Z","lastTransitionTime":"2025-12-01T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.829404 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.829451 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.829461 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.829481 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.829493 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:51Z","lastTransitionTime":"2025-12-01T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.931858 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.931942 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.931955 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.931980 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:51 crc kubenswrapper[4909]: I1201 10:31:51.931990 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:51Z","lastTransitionTime":"2025-12-01T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.034613 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.034657 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.034666 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.034683 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.034697 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:52Z","lastTransitionTime":"2025-12-01T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.137098 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.137173 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.137189 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.137213 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.137229 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:52Z","lastTransitionTime":"2025-12-01T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.240324 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.240423 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.240452 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.240493 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.240521 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:52Z","lastTransitionTime":"2025-12-01T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.256699 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.256796 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.256782 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:52 crc kubenswrapper[4909]: E1201 10:31:52.257059 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:52 crc kubenswrapper[4909]: E1201 10:31:52.257162 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:52 crc kubenswrapper[4909]: E1201 10:31:52.257244 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.344080 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.344145 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.344163 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.344190 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.344210 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:52Z","lastTransitionTime":"2025-12-01T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.447679 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.447736 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.447748 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.447772 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.447786 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:52Z","lastTransitionTime":"2025-12-01T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.489444 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5rks_57aeccf3-ec18-4a73-bd74-9b188de510ad/ovnkube-controller/0.log" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.494588 4909 generic.go:334] "Generic (PLEG): container finished" podID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerID="bd92e85c3e56138789470a95773fbe3854ad50ab7b850542e6f781efb70156ee" exitCode=1 Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.494652 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerDied","Data":"bd92e85c3e56138789470a95773fbe3854ad50ab7b850542e6f781efb70156ee"} Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.495749 4909 scope.go:117] "RemoveContainer" containerID="bd92e85c3e56138789470a95773fbe3854ad50ab7b850542e6f781efb70156ee" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.527529 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:52Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.553528 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.553607 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.553631 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 
10:31:52.553665 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.553691 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:52Z","lastTransitionTime":"2025-12-01T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.560341 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7ee
d869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:52Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.599067 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:52Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.613582 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:
24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:52Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.626960 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:52Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.640931 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:52Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.657384 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.657455 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.657479 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.657503 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.657539 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:52Z","lastTransitionTime":"2025-12-01T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.667638 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd92e85c3e56138789470a95773fbe3854ad50ab7b850542e6f781efb70156ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd92e85c3e56138789470a95773fbe3854ad50ab7b850542e6f781efb70156ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:52Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:117\\\\nI1201 10:31:52.180053 6232 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 10:31:52.180071 6232 handler.go:190] Sending *v1.Namespace event 
handler 5 for removal\\\\nI1201 10:31:52.180111 6232 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 10:31:52.180146 6232 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 10:31:52.180162 6232 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 10:31:52.180171 6232 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 10:31:52.180187 6232 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 10:31:52.180196 6232 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 10:31:52.180200 6232 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 10:31:52.180209 6232 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 10:31:52.180220 6232 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:31:52.180271 6232 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:31:52.180284 6232 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 10:31:52.180359 6232 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 10:31:52.180360 6232 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:52Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.681974 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb9
07da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:52Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.698979 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 
10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:52Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.714314 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:52Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.729577 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:52Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.743998 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:52Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.760217 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.760270 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.760282 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.760298 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.760312 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:52Z","lastTransitionTime":"2025-12-01T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.763605 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:52Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.778485 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:52Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.790276 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:52Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.863085 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.863511 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.863522 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.863539 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.863551 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:52Z","lastTransitionTime":"2025-12-01T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.966160 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.966241 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.966255 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.966276 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:52 crc kubenswrapper[4909]: I1201 10:31:52.966289 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:52Z","lastTransitionTime":"2025-12-01T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.070600 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.071012 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.071106 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.071203 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.071283 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:53Z","lastTransitionTime":"2025-12-01T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.175128 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.175189 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.175203 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.175227 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.175243 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:53Z","lastTransitionTime":"2025-12-01T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.278140 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.278202 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.278216 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.278240 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.278255 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:53Z","lastTransitionTime":"2025-12-01T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.285129 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd92e85c3e56138789470a95773fbe3854ad50ab7b850542e6f781efb70156ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd92e85c3e56138789470a95773fbe3854ad50ab7b850542e6f781efb70156ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:52Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:117\\\\nI1201 10:31:52.180053 6232 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 10:31:52.180071 6232 handler.go:190] Sending *v1.Namespace event 
handler 5 for removal\\\\nI1201 10:31:52.180111 6232 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 10:31:52.180146 6232 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 10:31:52.180162 6232 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 10:31:52.180171 6232 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 10:31:52.180187 6232 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 10:31:52.180196 6232 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 10:31:52.180200 6232 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 10:31:52.180209 6232 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 10:31:52.180220 6232 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:31:52.180271 6232 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:31:52.180284 6232 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 10:31:52.180359 6232 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 10:31:52.180360 6232 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.298980 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb9
07da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.320746 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.352407 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:
24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.367989 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.381314 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.381372 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.381384 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.381405 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.381419 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:53Z","lastTransitionTime":"2025-12-01T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.384855 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.400364 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 
10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.415847 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.432454 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.447749 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.467626 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.481477 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.484537 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.484593 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.484604 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.484624 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.484636 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:53Z","lastTransitionTime":"2025-12-01T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.492975 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.499036 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5rks_57aeccf3-ec18-4a73-bd74-9b188de510ad/ovnkube-controller/0.log" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.501837 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerStarted","Data":"99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a"} Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.501932 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.508101 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.523708 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d4
5f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.538327 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.552491 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.572513 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.587511 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:53 crc 
kubenswrapper[4909]: I1201 10:31:53.587778 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.587892 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.587967 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.588030 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:53Z","lastTransitionTime":"2025-12-01T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.589108 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.608559 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.621533 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.631893 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.644103 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.658731 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d4
5f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.676698 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e6
7405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.688161 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.691341 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.691387 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.691399 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.691416 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.691427 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:53Z","lastTransitionTime":"2025-12-01T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.701622 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.713603 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.730198 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd92e85c3e56138789470a95773fbe3854ad50ab7b850542e6f781efb70156ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:52Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:117\\\\nI1201 10:31:52.180053 6232 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 10:31:52.180071 6232 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:52.180111 6232 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1201 10:31:52.180146 6232 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 10:31:52.180162 6232 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 10:31:52.180171 6232 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 10:31:52.180187 6232 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 10:31:52.180196 6232 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 10:31:52.180200 6232 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 10:31:52.180209 6232 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 10:31:52.180220 6232 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:31:52.180271 6232 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:31:52.180284 6232 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 10:31:52.180359 6232 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 10:31:52.180360 6232 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.741530 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb9
07da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.794214 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.794272 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.794286 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:53 crc 
kubenswrapper[4909]: I1201 10:31:53.794303 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.794314 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:53Z","lastTransitionTime":"2025-12-01T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.898426 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.898491 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.898505 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.898525 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:53 crc kubenswrapper[4909]: I1201 10:31:53.898538 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:53Z","lastTransitionTime":"2025-12-01T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.001818 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.001865 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.001903 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.001923 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.001937 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:54Z","lastTransitionTime":"2025-12-01T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.106513 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.106587 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.106609 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.106642 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.106667 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:54Z","lastTransitionTime":"2025-12-01T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.210520 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.211077 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.211335 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.211550 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.211745 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:54Z","lastTransitionTime":"2025-12-01T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.256982 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:54 crc kubenswrapper[4909]: E1201 10:31:54.257222 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.256982 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:54 crc kubenswrapper[4909]: E1201 10:31:54.257370 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.256982 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:54 crc kubenswrapper[4909]: E1201 10:31:54.257470 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.314618 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.314848 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.314935 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.315001 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.315087 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:54Z","lastTransitionTime":"2025-12-01T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.417361 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.417402 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.417412 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.417428 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.417438 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:54Z","lastTransitionTime":"2025-12-01T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.505732 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5rks_57aeccf3-ec18-4a73-bd74-9b188de510ad/ovnkube-controller/1.log" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.507986 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5rks_57aeccf3-ec18-4a73-bd74-9b188de510ad/ovnkube-controller/0.log" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.516796 4909 generic.go:334] "Generic (PLEG): container finished" podID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerID="99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a" exitCode=1 Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.516861 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerDied","Data":"99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a"} Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.517056 4909 scope.go:117] "RemoveContainer" containerID="bd92e85c3e56138789470a95773fbe3854ad50ab7b850542e6f781efb70156ee" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.518036 4909 scope.go:117] "RemoveContainer" containerID="99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a" Dec 01 10:31:54 crc kubenswrapper[4909]: E1201 10:31:54.518447 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.524723 4909 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.524779 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.524796 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.524823 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.524836 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:54Z","lastTransitionTime":"2025-12-01T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.538636 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:54Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.552743 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:54Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.573177 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd92e85c3e56138789470a95773fbe3854ad50ab7b850542e6f781efb70156ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:52Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:117\\\\nI1201 10:31:52.180053 6232 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 10:31:52.180071 6232 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:52.180111 6232 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1201 10:31:52.180146 6232 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 10:31:52.180162 6232 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 10:31:52.180171 6232 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 10:31:52.180187 6232 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 10:31:52.180196 6232 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 10:31:52.180200 6232 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 10:31:52.180209 6232 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 10:31:52.180220 6232 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:31:52.180271 6232 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:31:52.180284 6232 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 10:31:52.180359 6232 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 10:31:52.180360 6232 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:53Z\\\",\\\"message\\\":\\\"ated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 10:31:53.347088 6359 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-console/networking-console-plugin]} name:Service_openshift-network-console/networking-console-plugin_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.246:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ab0b1d51-5ec6-479b-8881-93dfa8d30337}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\
"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:54Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.585812 
4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:54Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.602596 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:54Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.616825 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:
24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:54Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.627500 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.627763 
4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.627846 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.628023 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.628059 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:54Z","lastTransitionTime":"2025-12-01T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.634254 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:54Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.651561 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:54Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.673126 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:54Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.687821 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:54Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.717248 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:54Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.730069 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.730119 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.730131 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:54 crc 
kubenswrapper[4909]: I1201 10:31:54.730150 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.730164 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:54Z","lastTransitionTime":"2025-12-01T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.740931 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:54Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.765022 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:54Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.779948 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:54Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.795171 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d4
5f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:54Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.832021 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.832061 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.832070 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.832086 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.832096 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:54Z","lastTransitionTime":"2025-12-01T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.934828 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.934860 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.934884 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.934898 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:54 crc kubenswrapper[4909]: I1201 10:31:54.934907 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:54Z","lastTransitionTime":"2025-12-01T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.037256 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.037296 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.037306 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.037322 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.037335 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.140482 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.140527 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.140538 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.140553 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.140565 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.243448 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.243485 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.243495 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.243510 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.243520 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.346107 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.346164 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.346175 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.346194 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.346206 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.449615 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.449688 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.449706 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.449736 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.449753 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.524170 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p"] Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.524769 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.525712 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5rks_57aeccf3-ec18-4a73-bd74-9b188de510ad/ovnkube-controller/1.log" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.528274 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.528648 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.534291 4909 scope.go:117] "RemoveContainer" containerID="99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a" Dec 01 10:31:55 crc kubenswrapper[4909]: E1201 10:31:55.534896 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.545193 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.560493 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.560557 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.560572 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.560595 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.560612 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.562521 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.579206 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745
f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.588921 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/caaa6b24-fb98-4908-b7a7-929c44181c99-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8dv5p\" (UID: \"caaa6b24-fb98-4908-b7a7-929c44181c99\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.588986 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/caaa6b24-fb98-4908-b7a7-929c44181c99-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8dv5p\" (UID: \"caaa6b24-fb98-4908-b7a7-929c44181c99\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.589031 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd82w\" (UniqueName: \"kubernetes.io/projected/caaa6b24-fb98-4908-b7a7-929c44181c99-kube-api-access-sd82w\") pod \"ovnkube-control-plane-749d76644c-8dv5p\" (UID: \"caaa6b24-fb98-4908-b7a7-929c44181c99\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.589108 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/caaa6b24-fb98-4908-b7a7-929c44181c99-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8dv5p\" (UID: 
\"caaa6b24-fb98-4908-b7a7-929c44181c99\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.599653 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58
408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a
67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025
-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.616802 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.631695 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.646718 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.663866 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.663927 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.663938 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.663953 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.663963 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.673089 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd92e85c3e56138789470a95773fbe3854ad50ab7b850542e6f781efb70156ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:52Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:117\\\\nI1201 10:31:52.180053 6232 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 10:31:52.180071 6232 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 10:31:52.180111 6232 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1201 10:31:52.180146 6232 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 10:31:52.180162 6232 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 10:31:52.180171 6232 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 10:31:52.180187 6232 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 10:31:52.180196 6232 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 10:31:52.180200 6232 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 10:31:52.180209 6232 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 10:31:52.180220 6232 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:31:52.180271 6232 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:31:52.180284 6232 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 10:31:52.180359 6232 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 10:31:52.180360 6232 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:53Z\\\",\\\"message\\\":\\\"ated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 10:31:53.347088 6359 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-console/networking-console-plugin]} name:Service_openshift-network-console/networking-console-plugin_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.246:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ab0b1d51-5ec6-479b-8881-93dfa8d30337}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\
"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.690319 
4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/caaa6b24-fb98-4908-b7a7-929c44181c99-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8dv5p\" (UID: \"caaa6b24-fb98-4908-b7a7-929c44181c99\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.690400 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/caaa6b24-fb98-4908-b7a7-929c44181c99-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8dv5p\" (UID: \"caaa6b24-fb98-4908-b7a7-929c44181c99\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.690429 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/caaa6b24-fb98-4908-b7a7-929c44181c99-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8dv5p\" (UID: \"caaa6b24-fb98-4908-b7a7-929c44181c99\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.690472 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd82w\" (UniqueName: \"kubernetes.io/projected/caaa6b24-fb98-4908-b7a7-929c44181c99-kube-api-access-sd82w\") pod \"ovnkube-control-plane-749d76644c-8dv5p\" (UID: \"caaa6b24-fb98-4908-b7a7-929c44181c99\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.691786 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/caaa6b24-fb98-4908-b7a7-929c44181c99-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8dv5p\" (UID: 
\"caaa6b24-fb98-4908-b7a7-929c44181c99\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.692194 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/caaa6b24-fb98-4908-b7a7-929c44181c99-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8dv5p\" (UID: \"caaa6b24-fb98-4908-b7a7-929c44181c99\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.695963 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.699077 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/caaa6b24-fb98-4908-b7a7-929c44181c99-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8dv5p\" (UID: \"caaa6b24-fb98-4908-b7a7-929c44181c99\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.711179 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.719261 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd82w\" (UniqueName: \"kubernetes.io/projected/caaa6b24-fb98-4908-b7a7-929c44181c99-kube-api-access-sd82w\") pod \"ovnkube-control-plane-749d76644c-8dv5p\" (UID: \"caaa6b24-fb98-4908-b7a7-929c44181c99\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.727286 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeae
e65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\
"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.740231 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa
29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.754857 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caaa6b24-fb98-4908-b7a7-929c44181c99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8dv5p\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.766950 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.767004 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.767108 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.767187 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.767204 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.772077 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.787359 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.799577 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.819732 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.833129 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.841360 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.861126 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.870252 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.870335 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.870347 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:55 crc 
kubenswrapper[4909]: I1201 10:31:55.870368 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.870381 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.882296 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd
48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.899416 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.925540 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.943613 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:
24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.959278 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.973605 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.973652 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.973663 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.973680 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.973694 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:55Z","lastTransitionTime":"2025-12-01T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.974226 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:55 crc kubenswrapper[4909]: I1201 10:31:55.997357 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:53Z\\\",\\\"message\\\":\\\"ated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 10:31:53.347088 6359 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-console/networking-console-plugin]} name:Service_openshift-network-console/networking-console-plugin_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.246:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ab0b1d51-5ec6-479b-8881-93dfa8d30337}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6
426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:55Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.009821 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb9
07da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:56Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.024005 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 
10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:56Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.039189 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:56Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.055576 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:56Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.067774 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:56Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.077083 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.077154 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.077164 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.077187 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.077199 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.080298 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caaa6b24-fb98-4908-b7a7-929c44181c99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8dv5p\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:56Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.179758 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.179820 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.179842 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.179907 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.179931 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.257215 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.257318 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.257310 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:56 crc kubenswrapper[4909]: E1201 10:31:56.257504 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:56 crc kubenswrapper[4909]: E1201 10:31:56.257799 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:56 crc kubenswrapper[4909]: E1201 10:31:56.258093 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.283030 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.283083 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.283098 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.283117 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.283130 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.316340 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.316408 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.316418 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.316454 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.316467 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4909]: E1201 10:31:56.329782 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:56Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.333647 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.333691 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.333706 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.333730 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.333747 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4909]: E1201 10:31:56.347806 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:56Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.352494 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.352529 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.352573 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.352595 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.352608 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4909]: E1201 10:31:56.367086 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:56Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.371109 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.371157 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.371176 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.371203 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.371222 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4909]: E1201 10:31:56.385203 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:56Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.389991 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.390027 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.390042 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.390063 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.390075 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4909]: E1201 10:31:56.405622 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:56Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:56 crc kubenswrapper[4909]: E1201 10:31:56.405751 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.407548 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.407577 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.407589 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.407619 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.407632 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.510398 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.510433 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.510442 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.510462 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.510491 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.539103 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" event={"ID":"caaa6b24-fb98-4908-b7a7-929c44181c99","Type":"ContainerStarted","Data":"ffc679abce825df1b24d356105ee755209df36f9bbaf961f7a448e30a7561b24"} Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.539171 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" event={"ID":"caaa6b24-fb98-4908-b7a7-929c44181c99","Type":"ContainerStarted","Data":"a6ffdc1bcc70c693ce41518d7091b5be14c5a2b89faffd12d27f5507e50ec8c0"} Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.613555 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.613818 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.613948 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.614084 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.614195 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.716855 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.716910 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.716921 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.716940 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.716950 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.819155 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.819188 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.819199 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.819213 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.819222 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.922812 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.923406 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.923460 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.923486 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:56 crc kubenswrapper[4909]: I1201 10:31:56.923505 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:56Z","lastTransitionTime":"2025-12-01T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.026606 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.026663 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.026675 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.026698 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.026716 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:57Z","lastTransitionTime":"2025-12-01T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.129300 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.129350 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.129359 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.129377 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.129388 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:57Z","lastTransitionTime":"2025-12-01T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.188060 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.206065 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc2
76e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.230817 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.233254 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.233336 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.233354 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.233385 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.233405 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:57Z","lastTransitionTime":"2025-12-01T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.247465 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z 
is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.263679 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.279106 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caaa6b24-fb98-4908-b7a7-929c44181c99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8dv5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.295401 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.310531 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.322186 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.336806 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.336916 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.336930 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.336951 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.337019 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:57Z","lastTransitionTime":"2025-12-01T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.340660 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.361379 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d4
5f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.366160 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-z48j9"] Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.366974 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:31:57 crc kubenswrapper[4909]: E1201 10:31:57.367083 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.380352 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.396245 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.408726 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:31:57 crc kubenswrapper[4909]: E1201 10:31:57.408921 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:32:13.408889978 +0000 UTC m=+50.643360886 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.408977 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.409039 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm7r6\" (UniqueName: \"kubernetes.io/projected/dca0394a-c980-4220-ab44-d2f55519cb1a-kube-api-access-mm7r6\") pod \"network-metrics-daemon-z48j9\" (UID: \"dca0394a-c980-4220-ab44-d2f55519cb1a\") " pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.409073 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.409099 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.409124 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs\") pod \"network-metrics-daemon-z48j9\" (UID: \"dca0394a-c980-4220-ab44-d2f55519cb1a\") " pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.409157 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:57 crc kubenswrapper[4909]: E1201 10:31:57.409190 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:57 crc kubenswrapper[4909]: E1201 10:31:57.409213 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:57 crc kubenswrapper[4909]: E1201 10:31:57.409231 4909 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:57 crc kubenswrapper[4909]: E1201 10:31:57.409259 4909 
configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:57 crc kubenswrapper[4909]: E1201 10:31:57.409281 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 10:32:13.4092705 +0000 UTC m=+50.643741398 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:57 crc kubenswrapper[4909]: E1201 10:31:57.409312 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:32:13.409302171 +0000 UTC m=+50.643773069 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:31:57 crc kubenswrapper[4909]: E1201 10:31:57.409352 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:57 crc kubenswrapper[4909]: E1201 10:31:57.409404 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:31:57 crc kubenswrapper[4909]: E1201 10:31:57.409436 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:31:57 crc kubenswrapper[4909]: E1201 10:31:57.409458 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:57 crc kubenswrapper[4909]: E1201 10:31:57.409494 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:32:13.409461357 +0000 UTC m=+50.643932445 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:31:57 crc kubenswrapper[4909]: E1201 10:31:57.409523 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 10:32:13.409511089 +0000 UTC m=+50.643982227 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.416254 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.432623 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.439411 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.439473 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.439493 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.439523 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.439560 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:57Z","lastTransitionTime":"2025-12-01T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.458593 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:53Z\\\",\\\"message\\\":\\\"ated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 10:31:53.347088 6359 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-console/networking-console-plugin]} name:Service_openshift-network-console/networking-console-plugin_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.246:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ab0b1d51-5ec6-479b-8881-93dfa8d30337}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6
426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.473102 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb9
07da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.485931 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.500155 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d4
5f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.510076 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm7r6\" (UniqueName: \"kubernetes.io/projected/dca0394a-c980-4220-ab44-d2f55519cb1a-kube-api-access-mm7r6\") pod \"network-metrics-daemon-z48j9\" (UID: \"dca0394a-c980-4220-ab44-d2f55519cb1a\") " pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.510169 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs\") pod \"network-metrics-daemon-z48j9\" (UID: \"dca0394a-c980-4220-ab44-d2f55519cb1a\") " pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:31:57 crc kubenswrapper[4909]: E1201 10:31:57.510307 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:31:57 crc kubenswrapper[4909]: E1201 10:31:57.510362 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs podName:dca0394a-c980-4220-ab44-d2f55519cb1a nodeName:}" failed. No retries permitted until 2025-12-01 10:31:58.010348972 +0000 UTC m=+35.244819870 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs") pod "network-metrics-daemon-z48j9" (UID: "dca0394a-c980-4220-ab44-d2f55519cb1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.511541 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z48j9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca0394a-c980-4220-ab44-d2f55519cb1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:57Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z48j9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc 
kubenswrapper[4909]: I1201 10:31:57.522815 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.529309 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm7r6\" (UniqueName: \"kubernetes.io/projected/dca0394a-c980-4220-ab44-d2f55519cb1a-kube-api-access-mm7r6\") pod \"network-metrics-daemon-z48j9\" (UID: \"dca0394a-c980-4220-ab44-d2f55519cb1a\") 
" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.542182 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.542286 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.542308 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.542334 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.542354 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:57Z","lastTransitionTime":"2025-12-01T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.543542 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.545021 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" event={"ID":"caaa6b24-fb98-4908-b7a7-929c44181c99","Type":"ContainerStarted","Data":"fc42c2cf8c051b5eaf0d65edea6f2fccc58ef6a798030d1e16ab714d57916f4e"} Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.558611 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.570717 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.581157 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.597644 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:53Z\\\",\\\"message\\\":\\\"ated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 10:31:53.347088 6359 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-console/networking-console-plugin]} name:Service_openshift-network-console/networking-console-plugin_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.246:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ab0b1d51-5ec6-479b-8881-93dfa8d30337}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6
426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.612716 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd4846
57d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.627351 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.639833 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.644609 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.644638 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.644649 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.644682 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.644694 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:57Z","lastTransitionTime":"2025-12-01T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.651193 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.662709 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caaa6b24-fb98-4908-b7a7-929c44181c99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8dv5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.676842 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.689351 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.702158 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.712039 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703
f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.724424 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.735726 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.746565 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z48j9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca0394a-c980-4220-ab44-d2f55519cb1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:57Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z48j9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc 
kubenswrapper[4909]: I1201 10:31:57.747439 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.747474 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.747485 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.747503 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.747514 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:57Z","lastTransitionTime":"2025-12-01T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.759096 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.774497 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d4
5f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.790859 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.805220 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.818536 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.839774 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:53Z\\\",\\\"message\\\":\\\"ated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 10:31:53.347088 6359 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-console/networking-console-plugin]} name:Service_openshift-network-console/networking-console-plugin_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.246:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ab0b1d51-5ec6-479b-8881-93dfa8d30337}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6
426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.849862 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.849916 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.849928 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.849947 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.849960 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:57Z","lastTransitionTime":"2025-12-01T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.853943 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.876030 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setu
p\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.887057 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.897821 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.906638 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.917653 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caaa6b24-fb98-4908-b7a7-929c44181c99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc679abce825df1b24d356105ee755209df36f9bbaf961f7a448e30a7561b24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc42c2cf8c051b5eaf0d65edea6f2fccc58ef
6a798030d1e16ab714d57916f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8dv5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.929346 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd4846
57d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:31:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.952448 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.952496 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.952511 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.952529 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:57 crc kubenswrapper[4909]: I1201 10:31:57.952542 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:57Z","lastTransitionTime":"2025-12-01T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.016516 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs\") pod \"network-metrics-daemon-z48j9\" (UID: \"dca0394a-c980-4220-ab44-d2f55519cb1a\") " pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:31:58 crc kubenswrapper[4909]: E1201 10:31:58.016756 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:31:58 crc kubenswrapper[4909]: E1201 10:31:58.016865 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs podName:dca0394a-c980-4220-ab44-d2f55519cb1a nodeName:}" failed. No retries permitted until 2025-12-01 10:31:59.016843459 +0000 UTC m=+36.251314357 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs") pod "network-metrics-daemon-z48j9" (UID: "dca0394a-c980-4220-ab44-d2f55519cb1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.054804 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.054864 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.054911 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.054943 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.054965 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:58Z","lastTransitionTime":"2025-12-01T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.158068 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.158104 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.158116 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.158133 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.158145 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:58Z","lastTransitionTime":"2025-12-01T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.257193 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:31:58 crc kubenswrapper[4909]: E1201 10:31:58.257340 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.257213 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.257401 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:31:58 crc kubenswrapper[4909]: E1201 10:31:58.257451 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:31:58 crc kubenswrapper[4909]: E1201 10:31:58.257633 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.261061 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.261177 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.261254 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.261322 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.261406 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:58Z","lastTransitionTime":"2025-12-01T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.365173 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.365210 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.365222 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.365240 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.365252 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:58Z","lastTransitionTime":"2025-12-01T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.468203 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.468253 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.468265 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.468283 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.468294 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:58Z","lastTransitionTime":"2025-12-01T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.570735 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.570774 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.570783 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.570798 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.570810 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:58Z","lastTransitionTime":"2025-12-01T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.673798 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.673854 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.673867 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.673910 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.673924 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:58Z","lastTransitionTime":"2025-12-01T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.776810 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.777179 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.777326 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.777448 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.777564 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:58Z","lastTransitionTime":"2025-12-01T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.880462 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.880502 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.880514 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.880529 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.880540 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:58Z","lastTransitionTime":"2025-12-01T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.983178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.983209 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.983217 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.983230 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:58 crc kubenswrapper[4909]: I1201 10:31:58.983246 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:58Z","lastTransitionTime":"2025-12-01T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.027800 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs\") pod \"network-metrics-daemon-z48j9\" (UID: \"dca0394a-c980-4220-ab44-d2f55519cb1a\") " pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:31:59 crc kubenswrapper[4909]: E1201 10:31:59.027938 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:31:59 crc kubenswrapper[4909]: E1201 10:31:59.027990 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs podName:dca0394a-c980-4220-ab44-d2f55519cb1a nodeName:}" failed. No retries permitted until 2025-12-01 10:32:01.027973648 +0000 UTC m=+38.262444546 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs") pod "network-metrics-daemon-z48j9" (UID: "dca0394a-c980-4220-ab44-d2f55519cb1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.085270 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.085479 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.085631 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.085716 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.085798 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:59Z","lastTransitionTime":"2025-12-01T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.187545 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.187582 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.187592 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.187605 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.187613 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:59Z","lastTransitionTime":"2025-12-01T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.256267 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:31:59 crc kubenswrapper[4909]: E1201 10:31:59.256559 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.289805 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.289951 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.289968 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.289987 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.289997 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:59Z","lastTransitionTime":"2025-12-01T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.393100 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.393137 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.393147 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.393163 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.393172 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:59Z","lastTransitionTime":"2025-12-01T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.495728 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.495774 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.495788 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.495807 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.495817 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:59Z","lastTransitionTime":"2025-12-01T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.597936 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.598005 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.598038 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.598067 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.598087 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:59Z","lastTransitionTime":"2025-12-01T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.700735 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.700787 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.700801 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.700818 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.700831 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:59Z","lastTransitionTime":"2025-12-01T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.803308 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.803360 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.803376 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.803400 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.803415 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:59Z","lastTransitionTime":"2025-12-01T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.905984 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.906037 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.906046 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.906064 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:31:59 crc kubenswrapper[4909]: I1201 10:31:59.906073 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:31:59Z","lastTransitionTime":"2025-12-01T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.008593 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.008647 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.008664 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.008694 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.008714 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:00Z","lastTransitionTime":"2025-12-01T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.112104 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.112197 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.112224 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.112257 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.112282 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:00Z","lastTransitionTime":"2025-12-01T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.214773 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.214838 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.214852 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.214904 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.214928 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:00Z","lastTransitionTime":"2025-12-01T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.256921 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.256933 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:00 crc kubenswrapper[4909]: E1201 10:32:00.257248 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.256951 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:00 crc kubenswrapper[4909]: E1201 10:32:00.257317 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:00 crc kubenswrapper[4909]: E1201 10:32:00.257520 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.319019 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.319124 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.319145 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.319204 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.319224 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:00Z","lastTransitionTime":"2025-12-01T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.423439 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.423541 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.423592 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.423617 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.423637 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:00Z","lastTransitionTime":"2025-12-01T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.530820 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.530925 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.530944 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.530968 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.530987 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:00Z","lastTransitionTime":"2025-12-01T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.634118 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.634153 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.634162 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.634181 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.634195 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:00Z","lastTransitionTime":"2025-12-01T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.737128 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.737199 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.737227 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.737259 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.737281 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:00Z","lastTransitionTime":"2025-12-01T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.841425 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.841481 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.841493 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.841525 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.841538 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:00Z","lastTransitionTime":"2025-12-01T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.945282 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.945373 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.945392 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.945424 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:00 crc kubenswrapper[4909]: I1201 10:32:00.945444 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:00Z","lastTransitionTime":"2025-12-01T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.048711 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.048785 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.048808 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.048845 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.048868 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:01Z","lastTransitionTime":"2025-12-01T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.051225 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs\") pod \"network-metrics-daemon-z48j9\" (UID: \"dca0394a-c980-4220-ab44-d2f55519cb1a\") " pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:01 crc kubenswrapper[4909]: E1201 10:32:01.051406 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:32:01 crc kubenswrapper[4909]: E1201 10:32:01.051476 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs podName:dca0394a-c980-4220-ab44-d2f55519cb1a nodeName:}" failed. No retries permitted until 2025-12-01 10:32:05.0514567 +0000 UTC m=+42.285927598 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs") pod "network-metrics-daemon-z48j9" (UID: "dca0394a-c980-4220-ab44-d2f55519cb1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.153014 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.153070 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.153089 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.153116 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.153137 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:01Z","lastTransitionTime":"2025-12-01T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.255749 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.255794 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.255804 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.255820 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.255832 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:01Z","lastTransitionTime":"2025-12-01T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.256341 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:01 crc kubenswrapper[4909]: E1201 10:32:01.256605 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.359412 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.359764 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.359918 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.360198 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.360299 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:01Z","lastTransitionTime":"2025-12-01T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.463642 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.463716 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.463737 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.463767 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.463789 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:01Z","lastTransitionTime":"2025-12-01T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.567206 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.567285 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.567303 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.567327 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.567348 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:01Z","lastTransitionTime":"2025-12-01T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.671494 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.671556 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.671574 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.671604 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.671626 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:01Z","lastTransitionTime":"2025-12-01T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.779951 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.780043 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.780116 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.780154 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.780180 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:01Z","lastTransitionTime":"2025-12-01T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.883299 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.883378 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.883397 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.883428 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:01 crc kubenswrapper[4909]: I1201 10:32:01.883450 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:01Z","lastTransitionTime":"2025-12-01T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.010564 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.010629 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.010642 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.010666 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.010680 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:02Z","lastTransitionTime":"2025-12-01T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.113456 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.113521 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.113535 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.113558 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.113572 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:02Z","lastTransitionTime":"2025-12-01T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.217272 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.217376 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.217398 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.217429 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.217451 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:02Z","lastTransitionTime":"2025-12-01T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.256914 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.257026 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.256930 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:02 crc kubenswrapper[4909]: E1201 10:32:02.257177 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:02 crc kubenswrapper[4909]: E1201 10:32:02.257314 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:02 crc kubenswrapper[4909]: E1201 10:32:02.257513 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.319702 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.319777 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.319797 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.319826 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.319845 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:02Z","lastTransitionTime":"2025-12-01T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.423225 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.423531 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.423613 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.423689 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.423764 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:02Z","lastTransitionTime":"2025-12-01T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.527628 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.527673 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.527684 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.527704 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.527717 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:02Z","lastTransitionTime":"2025-12-01T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.631794 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.632324 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.632794 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.633189 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.633352 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:02Z","lastTransitionTime":"2025-12-01T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.737518 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.737600 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.737622 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.737659 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.737683 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:02Z","lastTransitionTime":"2025-12-01T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.841507 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.841611 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.841637 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.841674 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.841694 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:02Z","lastTransitionTime":"2025-12-01T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.945656 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.945726 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.945746 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.945776 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:02 crc kubenswrapper[4909]: I1201 10:32:02.945797 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:02Z","lastTransitionTime":"2025-12-01T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.050040 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.050098 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.050118 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.050147 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.050167 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:03Z","lastTransitionTime":"2025-12-01T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.154428 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.154530 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.154549 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.154579 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.154602 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:03Z","lastTransitionTime":"2025-12-01T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.257127 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:03 crc kubenswrapper[4909]: E1201 10:32:03.257450 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.259074 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.259143 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.259168 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.259204 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.259229 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:03Z","lastTransitionTime":"2025-12-01T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.280803 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.303919 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.321312 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.351925 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.361456 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.361513 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.361533 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.361561 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.361636 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:03Z","lastTransitionTime":"2025-12-01T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.377701 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.405688 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z48j9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca0394a-c980-4220-ab44-d2f55519cb1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:57Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z48j9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:03 crc 
kubenswrapper[4909]: I1201 10:32:03.436673 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.454207 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.464157 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.464214 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.464228 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.464252 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.464269 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:03Z","lastTransitionTime":"2025-12-01T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.470165 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.484728 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.513999 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:53Z\\\",\\\"message\\\":\\\"ated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 10:31:53.347088 6359 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-console/networking-console-plugin]} name:Service_openshift-network-console/networking-console-plugin_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.246:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ab0b1d51-5ec6-479b-8881-93dfa8d30337}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6
426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.554199 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb9
07da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.567168 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.567223 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.567234 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:03 crc 
kubenswrapper[4909]: I1201 10:32:03.567249 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.567258 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:03Z","lastTransitionTime":"2025-12-01T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.587212 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10
:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 
10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.610076 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.625947 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.639894 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.657274 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caaa6b24-fb98-4908-b7a7-929c44181c99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc679abce825df1b24d356105ee755209df36f9bbaf961f7a448e30a7561b24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc42c2cf8c051b5eaf0d65edea6f2fccc58ef6a798030d1e16ab714d57916f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8dv5p\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.669242 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.669285 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.669301 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.669316 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.669325 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:03Z","lastTransitionTime":"2025-12-01T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.773353 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.773401 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.773416 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.773438 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.773455 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:03Z","lastTransitionTime":"2025-12-01T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.876740 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.876812 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.876837 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.876867 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.876990 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:03Z","lastTransitionTime":"2025-12-01T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.980004 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.980077 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.980098 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.980542 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:03 crc kubenswrapper[4909]: I1201 10:32:03.980599 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:03Z","lastTransitionTime":"2025-12-01T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.084640 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.084674 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.084691 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.084709 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.084720 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:04Z","lastTransitionTime":"2025-12-01T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.187913 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.187982 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.188013 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.188045 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.188063 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:04Z","lastTransitionTime":"2025-12-01T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.257174 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.257308 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:04 crc kubenswrapper[4909]: E1201 10:32:04.257429 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.257527 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:04 crc kubenswrapper[4909]: E1201 10:32:04.257782 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:04 crc kubenswrapper[4909]: E1201 10:32:04.257968 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.291060 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.291167 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.291183 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.291210 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.291230 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:04Z","lastTransitionTime":"2025-12-01T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.394125 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.394217 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.394228 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.394255 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.394274 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:04Z","lastTransitionTime":"2025-12-01T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.496834 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.496956 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.496987 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.497021 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.497063 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:04Z","lastTransitionTime":"2025-12-01T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.599389 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.599460 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.599477 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.599503 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.599522 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:04Z","lastTransitionTime":"2025-12-01T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.702442 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.702501 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.702513 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.702538 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.702553 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:04Z","lastTransitionTime":"2025-12-01T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.806211 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.806282 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.806299 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.806324 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.806344 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:04Z","lastTransitionTime":"2025-12-01T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.910146 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.910219 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.910236 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.910263 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:04 crc kubenswrapper[4909]: I1201 10:32:04.910285 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:04Z","lastTransitionTime":"2025-12-01T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.013766 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.013930 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.013967 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.013999 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.014023 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.096629 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs\") pod \"network-metrics-daemon-z48j9\" (UID: \"dca0394a-c980-4220-ab44-d2f55519cb1a\") " pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:05 crc kubenswrapper[4909]: E1201 10:32:05.096978 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:32:05 crc kubenswrapper[4909]: E1201 10:32:05.097106 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs podName:dca0394a-c980-4220-ab44-d2f55519cb1a nodeName:}" failed. No retries permitted until 2025-12-01 10:32:13.097066564 +0000 UTC m=+50.331537492 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs") pod "network-metrics-daemon-z48j9" (UID: "dca0394a-c980-4220-ab44-d2f55519cb1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.118117 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.118186 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.118211 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.118246 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.118270 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.221564 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.221849 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.221964 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.221999 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.222023 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.256388 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:05 crc kubenswrapper[4909]: E1201 10:32:05.256634 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.324560 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.324598 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.324607 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.324623 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.324632 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.428189 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.428263 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.428291 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.428323 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.428348 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.532052 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.532118 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.532161 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.532189 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.532208 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.636971 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.637003 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.637013 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.637029 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.637039 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.740304 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.740406 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.740435 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.740469 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.740495 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.843416 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.843474 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.843486 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.843506 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.843519 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.946735 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.946803 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.946820 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.946843 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:05 crc kubenswrapper[4909]: I1201 10:32:05.946856 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:05Z","lastTransitionTime":"2025-12-01T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.051000 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.051064 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.051091 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.051117 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.051133 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.156276 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.156350 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.156364 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.156389 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.156403 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.256429 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.256478 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.256630 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:06 crc kubenswrapper[4909]: E1201 10:32:06.256675 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:06 crc kubenswrapper[4909]: E1201 10:32:06.256773 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:06 crc kubenswrapper[4909]: E1201 10:32:06.256933 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.258238 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.258283 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.258295 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.258310 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.258323 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.361806 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.361848 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.361862 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.361909 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.361925 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.464573 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.464622 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.464637 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.464656 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.464669 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.509459 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.509494 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.509503 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.509517 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.509527 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4909]: E1201 10:32:06.530540 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:06Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.539149 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.539270 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.539284 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.539308 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.539327 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4909]: E1201 10:32:06.593107 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:06Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.597243 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.597325 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.597345 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.597375 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.597394 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4909]: E1201 10:32:06.611291 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:06Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:06 crc kubenswrapper[4909]: E1201 10:32:06.611422 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.613720 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.613757 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.613770 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.613788 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.613801 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.716988 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.717091 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.717117 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.717157 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.717187 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.820799 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.820896 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.820913 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.820939 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.820955 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.923532 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.923587 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.923603 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.923630 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:06 crc kubenswrapper[4909]: I1201 10:32:06.923645 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:06Z","lastTransitionTime":"2025-12-01T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.026698 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.026748 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.026762 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.026783 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.026801 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:07Z","lastTransitionTime":"2025-12-01T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.128596 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.128636 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.128645 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.128660 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.128672 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:07Z","lastTransitionTime":"2025-12-01T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.230961 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.231008 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.231020 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.231036 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.231046 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:07Z","lastTransitionTime":"2025-12-01T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.257121 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:07 crc kubenswrapper[4909]: E1201 10:32:07.257362 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.334400 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.334444 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.334453 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.334470 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.334480 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:07Z","lastTransitionTime":"2025-12-01T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.437570 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.437657 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.437677 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.437708 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.437728 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:07Z","lastTransitionTime":"2025-12-01T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.541225 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.541303 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.541322 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.541357 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.541378 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:07Z","lastTransitionTime":"2025-12-01T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.645410 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.645452 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.645464 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.645481 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.645492 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:07Z","lastTransitionTime":"2025-12-01T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.747994 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.748072 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.748090 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.748123 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.748145 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:07Z","lastTransitionTime":"2025-12-01T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.852213 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.852306 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.852334 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.852371 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.852397 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:07Z","lastTransitionTime":"2025-12-01T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.957088 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.957135 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.957147 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.957165 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:07 crc kubenswrapper[4909]: I1201 10:32:07.957181 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:07Z","lastTransitionTime":"2025-12-01T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.060443 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.060962 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.061580 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.061966 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.062293 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:08Z","lastTransitionTime":"2025-12-01T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.166815 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.167377 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.167584 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.167844 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.168093 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:08Z","lastTransitionTime":"2025-12-01T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.257272 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:08 crc kubenswrapper[4909]: E1201 10:32:08.257466 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.258320 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.258448 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:08 crc kubenswrapper[4909]: E1201 10:32:08.258485 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:08 crc kubenswrapper[4909]: E1201 10:32:08.258959 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.259344 4909 scope.go:117] "RemoveContainer" containerID="99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.278506 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.278904 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.278920 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.278944 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.278958 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:08Z","lastTransitionTime":"2025-12-01T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.383091 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.383156 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.383173 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.383200 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.383220 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:08Z","lastTransitionTime":"2025-12-01T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.486636 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.486837 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.486859 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.487421 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.487484 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:08Z","lastTransitionTime":"2025-12-01T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.589937 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.590060 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.590123 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.590157 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.590215 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:08Z","lastTransitionTime":"2025-12-01T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.592195 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5rks_57aeccf3-ec18-4a73-bd74-9b188de510ad/ovnkube-controller/1.log" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.597839 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerStarted","Data":"a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9"} Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.598006 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.616943 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] 
\\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:08Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.631680 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:08Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.649993 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:08Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.669001 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:08Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.686927 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caaa6b24-fb98-4908-b7a7-929c44181c99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc679abce825df1b24d356105ee755209df36f9bbaf961f7a448e30a7561b24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc42c2cf8c051b5eaf0d65edea6f2fccc58ef6a798030d1e16ab714d57916f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8dv5p\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:08Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.693313 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.693368 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.693378 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.693398 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.693408 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:08Z","lastTransitionTime":"2025-12-01T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.707016 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:08Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.728374 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:08Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.744456 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:08Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.769276 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:08Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.788613 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d4
5f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:08Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.796647 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.796682 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.796695 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.796714 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.796728 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:08Z","lastTransitionTime":"2025-12-01T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.806526 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z48j9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca0394a-c980-4220-ab44-d2f55519cb1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:57Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z48j9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:08Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:08 crc 
kubenswrapper[4909]: I1201 10:32:08.817623 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:08Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.837321 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:08Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.852647 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:
24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:08Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.867083 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:08Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.879645 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:08Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.900242 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.900324 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.900343 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.900381 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.900400 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:08Z","lastTransitionTime":"2025-12-01T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:08 crc kubenswrapper[4909]: I1201 10:32:08.903052 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:53Z\\\",\\\"message\\\":\\\"ated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 10:31:53.347088 6359 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-console/networking-console-plugin]} name:Service_openshift-network-console/networking-console-plugin_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.246:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ab0b1d51-5ec6-479b-8881-93dfa8d30337}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:08Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.003757 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.003903 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.003917 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.003933 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.003943 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:09Z","lastTransitionTime":"2025-12-01T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.106444 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.106487 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.106504 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.106524 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.106537 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:09Z","lastTransitionTime":"2025-12-01T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.210242 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.210299 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.210314 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.210336 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.210351 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:09Z","lastTransitionTime":"2025-12-01T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.256847 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:09 crc kubenswrapper[4909]: E1201 10:32:09.257095 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.315040 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.315133 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.315156 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.315185 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.315204 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:09Z","lastTransitionTime":"2025-12-01T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.418268 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.418322 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.418335 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.418356 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.418368 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:09Z","lastTransitionTime":"2025-12-01T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.521684 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.521732 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.521749 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.521774 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.521793 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:09Z","lastTransitionTime":"2025-12-01T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.603941 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5rks_57aeccf3-ec18-4a73-bd74-9b188de510ad/ovnkube-controller/2.log" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.604939 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5rks_57aeccf3-ec18-4a73-bd74-9b188de510ad/ovnkube-controller/1.log" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.607846 4909 generic.go:334] "Generic (PLEG): container finished" podID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerID="a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9" exitCode=1 Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.607920 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerDied","Data":"a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9"} Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.607993 4909 scope.go:117] "RemoveContainer" containerID="99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.609063 4909 scope.go:117] "RemoveContainer" containerID="a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9" Dec 01 10:32:09 crc kubenswrapper[4909]: E1201 10:32:09.609318 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.624705 4909 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.624944 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.625001 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.625017 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.625046 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.625064 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:09Z","lastTransitionTime":"2025-12-01T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.642280 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.655392 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.671131 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.687127 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d4
5f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.702796 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z48j9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca0394a-c980-4220-ab44-d2f55519cb1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:57Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z48j9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc 
kubenswrapper[4909]: I1201 10:32:09.727968 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.728047 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.728068 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.728101 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.728122 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:09Z","lastTransitionTime":"2025-12-01T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.728386 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99212f861c63fc5bb1e7ebbcd954dc52ed229322b291de123c7007421fa1d36a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:31:53Z\\\",\\\"message\\\":\\\"ated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 10:31:53.347088 6359 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-network-console/networking-console-plugin]} name:Service_openshift-network-console/networking-console-plugin_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.246:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ab0b1d51-5ec6-479b-8881-93dfa8d30337}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:09Z\\\",\\\"message\\\":\\\"sion-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 10:32:09.164050 6569 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/no\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d
200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.747578 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb9
07da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.776263 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.794596 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:
24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.813864 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.830778 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.831573 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.831644 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.831662 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.831684 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.831702 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:09Z","lastTransitionTime":"2025-12-01T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.849220 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caaa6b24-fb98-4908-b7a7-929c44181c99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc679abce825df1b24d356105ee755209df36f9bbaf961f7a448e30a7561b24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f1
2962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc42c2cf8c051b5eaf0d65edea6f2fccc58ef6a798030d1e16ab714d57916f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:55Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8dv5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.870617 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o
://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.891072 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.913946 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.930830 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:09Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.934573 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.934615 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.934627 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.934644 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:09 crc kubenswrapper[4909]: I1201 10:32:09.934655 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:09Z","lastTransitionTime":"2025-12-01T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.038117 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.038171 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.038187 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.038211 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.038228 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:10Z","lastTransitionTime":"2025-12-01T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.140688 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.140726 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.140738 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.140755 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.140767 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:10Z","lastTransitionTime":"2025-12-01T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.243214 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.243258 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.243271 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.243291 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.243310 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:10Z","lastTransitionTime":"2025-12-01T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.257073 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.257226 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:10 crc kubenswrapper[4909]: E1201 10:32:10.257272 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.257323 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:10 crc kubenswrapper[4909]: E1201 10:32:10.257476 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:10 crc kubenswrapper[4909]: E1201 10:32:10.257657 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.347046 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.347122 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.347144 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.347173 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.347192 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:10Z","lastTransitionTime":"2025-12-01T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.450741 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.450786 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.450805 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.450834 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.450854 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:10Z","lastTransitionTime":"2025-12-01T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.555700 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.555773 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.555796 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.555829 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.555853 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:10Z","lastTransitionTime":"2025-12-01T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.617862 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5rks_57aeccf3-ec18-4a73-bd74-9b188de510ad/ovnkube-controller/2.log" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.659136 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.659184 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.659202 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.659237 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.659263 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:10Z","lastTransitionTime":"2025-12-01T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.763427 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.763501 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.763521 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.763554 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.763576 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:10Z","lastTransitionTime":"2025-12-01T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.867258 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.867452 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.867486 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.867801 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.867844 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:10Z","lastTransitionTime":"2025-12-01T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.971471 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.971556 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.971585 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.971619 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:10 crc kubenswrapper[4909]: I1201 10:32:10.971645 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:10Z","lastTransitionTime":"2025-12-01T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.010665 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.014343 4909 scope.go:117] "RemoveContainer" containerID="a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9" Dec 01 10:32:11 crc kubenswrapper[4909]: E1201 10:32:11.016154 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.035818 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.059723 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.075471 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.075586 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.075606 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.075634 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.075653 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:11Z","lastTransitionTime":"2025-12-01T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.083504 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.111816 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.137354 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d4
5f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.155715 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z48j9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca0394a-c980-4220-ab44-d2f55519cb1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:57Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z48j9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc 
kubenswrapper[4909]: I1201 10:32:11.178752 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.178855 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.178887 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.178911 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.178924 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:11Z","lastTransitionTime":"2025-12-01T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.180503 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.201398 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.225187 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.241536 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.257073 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:11 crc kubenswrapper[4909]: E1201 10:32:11.257354 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.268377 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:09Z\\\",\\\"message\\\":\\\"sion-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 10:32:09.164050 6569 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/no\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:32:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6
426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.282924 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.283001 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.283026 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.283060 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.283084 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:11Z","lastTransitionTime":"2025-12-01T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.285570 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.307330 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"
name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 
1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3
4720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.325520 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.346561 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.362610 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.379465 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caaa6b24-fb98-4908-b7a7-929c44181c99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc679abce825df1b24d356105ee755209df36f9bbaf961f7a448e30a7561b24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc42c2cf8c051b5eaf0d65edea6f2fccc58ef
6a798030d1e16ab714d57916f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8dv5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:11Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.392186 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.392852 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.392894 4909 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.392928 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.392948 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:11Z","lastTransitionTime":"2025-12-01T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.496405 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.496487 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.496508 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.496538 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.496558 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:11Z","lastTransitionTime":"2025-12-01T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.600121 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.600222 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.600250 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.600288 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.600315 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:11Z","lastTransitionTime":"2025-12-01T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.704164 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.704256 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.704279 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.704311 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.704333 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:11Z","lastTransitionTime":"2025-12-01T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.807681 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.807760 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.807780 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.807808 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.807831 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:11Z","lastTransitionTime":"2025-12-01T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.910723 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.910777 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.910792 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.910812 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:11 crc kubenswrapper[4909]: I1201 10:32:11.910827 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:11Z","lastTransitionTime":"2025-12-01T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.014224 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.014277 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.014289 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.014311 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.014324 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:12Z","lastTransitionTime":"2025-12-01T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.117981 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.118055 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.118069 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.118098 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.118116 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:12Z","lastTransitionTime":"2025-12-01T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.222106 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.222204 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.222224 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.222258 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.222283 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:12Z","lastTransitionTime":"2025-12-01T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.256331 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.256453 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.256370 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:12 crc kubenswrapper[4909]: E1201 10:32:12.256590 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:12 crc kubenswrapper[4909]: E1201 10:32:12.256755 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:12 crc kubenswrapper[4909]: E1201 10:32:12.256928 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.326422 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.326479 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.326499 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.326523 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.326544 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:12Z","lastTransitionTime":"2025-12-01T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.429774 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.429848 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.429867 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.429946 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.429977 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:12Z","lastTransitionTime":"2025-12-01T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.532628 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.532702 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.532713 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.532731 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.532744 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:12Z","lastTransitionTime":"2025-12-01T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.635353 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.635416 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.635429 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.635450 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.635467 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:12Z","lastTransitionTime":"2025-12-01T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.738280 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.738320 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.738328 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.738365 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.738376 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:12Z","lastTransitionTime":"2025-12-01T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.841545 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.841589 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.841598 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.841618 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.841629 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:12Z","lastTransitionTime":"2025-12-01T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.944752 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.944802 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.944811 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.944833 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:12 crc kubenswrapper[4909]: I1201 10:32:12.944853 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:12Z","lastTransitionTime":"2025-12-01T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.048349 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.048397 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.048410 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.048435 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.048448 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:13Z","lastTransitionTime":"2025-12-01T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.152390 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.152456 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.152472 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.152495 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.152508 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:13Z","lastTransitionTime":"2025-12-01T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.192525 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs\") pod \"network-metrics-daemon-z48j9\" (UID: \"dca0394a-c980-4220-ab44-d2f55519cb1a\") " pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:13 crc kubenswrapper[4909]: E1201 10:32:13.192756 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:32:13 crc kubenswrapper[4909]: E1201 10:32:13.192850 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs podName:dca0394a-c980-4220-ab44-d2f55519cb1a nodeName:}" failed. No retries permitted until 2025-12-01 10:32:29.192829423 +0000 UTC m=+66.427300321 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs") pod "network-metrics-daemon-z48j9" (UID: "dca0394a-c980-4220-ab44-d2f55519cb1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.254901 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.254934 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.254943 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.254958 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.254968 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:13Z","lastTransitionTime":"2025-12-01T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.256395 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:13 crc kubenswrapper[4909]: E1201 10:32:13.256588 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.278781 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.294236 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.306518 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.322495 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.348257 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d4
5f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.357352 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.357435 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.357466 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.357547 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.357582 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:13Z","lastTransitionTime":"2025-12-01T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.366151 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z48j9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca0394a-c980-4220-ab44-d2f55519cb1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:57Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z48j9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:13 crc 
kubenswrapper[4909]: I1201 10:32:13.385821 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:09Z\\\",\\\"message\\\":\\\"sion-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 10:32:09.164050 6569 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/no\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:32:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6
426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.398318 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb9
07da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.416624 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.432152 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:
24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.453319 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.460638 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.460692 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.460701 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.460717 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.460726 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:13Z","lastTransitionTime":"2025-12-01T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.467277 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.480387 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caaa6b24-fb98-4908-b7a7-929c44181c99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc679abce825df1b24d356105ee755209df36f9bbaf961f7a448e30a7561b24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc42c2cf8c051b5eaf0d65edea6f2fccc58ef
6a798030d1e16ab714d57916f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8dv5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.496131 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 
10:32:13.496271 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.496320 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.496346 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.496244 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd4846
57d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.496372 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:13 crc kubenswrapper[4909]: E1201 10:32:13.496489 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:32:13 crc kubenswrapper[4909]: E1201 10:32:13.496508 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:32:13 crc kubenswrapper[4909]: E1201 10:32:13.496519 4909 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:32:13 crc kubenswrapper[4909]: E1201 10:32:13.496585 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:32:13 crc kubenswrapper[4909]: E1201 10:32:13.496681 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:32:13 crc kubenswrapper[4909]: E1201 10:32:13.496608 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 10:32:45.496575765 +0000 UTC m=+82.731046663 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:32:13 crc kubenswrapper[4909]: E1201 10:32:13.496702 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:32:13 crc kubenswrapper[4909]: E1201 10:32:13.496719 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:32:13 crc kubenswrapper[4909]: E1201 10:32:13.496742 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:32:45.49671244 +0000 UTC m=+82.731183348 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:32:13 crc kubenswrapper[4909]: E1201 10:32:13.496765 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 10:32:45.496754282 +0000 UTC m=+82.731225190 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:32:13 crc kubenswrapper[4909]: E1201 10:32:13.496782 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:32:13 crc kubenswrapper[4909]: E1201 10:32:13.496818 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:32:45.496801073 +0000 UTC m=+82.731271971 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:32:13 crc kubenswrapper[4909]: E1201 10:32:13.496927 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:32:45.496913237 +0000 UTC m=+82.731384355 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.508815 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:32:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.521712 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:32:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.533131 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.563839 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.563917 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.563931 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.563952 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.563965 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:13Z","lastTransitionTime":"2025-12-01T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.667561 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.667631 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.667649 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.667677 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.667699 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:13Z","lastTransitionTime":"2025-12-01T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.771368 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.771404 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.771414 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.771432 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.771441 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:13Z","lastTransitionTime":"2025-12-01T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.874072 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.874116 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.874126 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.874145 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.874155 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:13Z","lastTransitionTime":"2025-12-01T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.977937 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.978000 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.978022 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.978051 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:13 crc kubenswrapper[4909]: I1201 10:32:13.978072 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:13Z","lastTransitionTime":"2025-12-01T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.082091 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.082161 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.082182 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.082212 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.082229 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:14Z","lastTransitionTime":"2025-12-01T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.186101 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.186178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.186222 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.186260 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.186285 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:14Z","lastTransitionTime":"2025-12-01T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.256343 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.256392 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.256414 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:14 crc kubenswrapper[4909]: E1201 10:32:14.256505 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:14 crc kubenswrapper[4909]: E1201 10:32:14.256611 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:14 crc kubenswrapper[4909]: E1201 10:32:14.256703 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.290109 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.290151 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.290161 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.290178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.290189 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:14Z","lastTransitionTime":"2025-12-01T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.392933 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.393346 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.393574 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.393701 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.393983 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:14Z","lastTransitionTime":"2025-12-01T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.497309 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.497361 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.497373 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.497391 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.497401 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:14Z","lastTransitionTime":"2025-12-01T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.600414 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.600470 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.600482 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.600498 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.600508 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:14Z","lastTransitionTime":"2025-12-01T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.702974 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.703034 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.703050 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.703072 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.703086 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:14Z","lastTransitionTime":"2025-12-01T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.806095 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.806167 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.806205 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.806256 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.806281 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:14Z","lastTransitionTime":"2025-12-01T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.910650 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.910725 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.910776 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.910814 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:14 crc kubenswrapper[4909]: I1201 10:32:14.910841 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:14Z","lastTransitionTime":"2025-12-01T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.013802 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.013852 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.013862 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.013903 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.013915 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.116244 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.116281 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.116290 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.116307 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.116317 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.219071 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.219157 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.219183 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.219220 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.219246 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.256568 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:15 crc kubenswrapper[4909]: E1201 10:32:15.256813 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.321632 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.321695 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.321714 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.321741 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.321760 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.424429 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.424486 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.424496 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.424544 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.424555 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.493345 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.509889 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.510096 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.527841 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.528161 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.528177 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.528191 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.528201 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.528317 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.545185 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.569032 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.587368 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d4
5f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.608813 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z48j9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca0394a-c980-4220-ab44-d2f55519cb1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:57Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z48j9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc 
kubenswrapper[4909]: I1201 10:32:15.626439 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.631757 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.631812 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.631945 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.632026 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.632048 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.663499 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.679901 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:
24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.703342 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.725178 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.734129 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.734299 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.734626 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.734904 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.735135 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.758275 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:09Z\\\",\\\"message\\\":\\\"sion-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 10:32:09.164050 6569 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/no\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:32:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6
426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.786466 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd4846
57d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.804259 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.826206 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.838992 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.839193 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.839333 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.839492 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.839674 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.844203 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.862784 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caaa6b24-fb98-4908-b7a7-929c44181c99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc679abc
e825df1b24d356105ee755209df36f9bbaf961f7a448e30a7561b24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc42c2cf8c051b5eaf0d65edea6f2fccc58ef6a798030d1e16ab714d57916f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8dv5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.942731 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.942826 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.942847 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.942936 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:15 crc kubenswrapper[4909]: I1201 10:32:15.942966 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:15Z","lastTransitionTime":"2025-12-01T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.046214 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.046271 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.046288 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.046314 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.046337 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.149619 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.149677 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.149692 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.149712 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.149725 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.253025 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.253078 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.253095 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.253119 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.253136 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.256642 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.256816 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:16 crc kubenswrapper[4909]: E1201 10:32:16.257038 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.257386 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:16 crc kubenswrapper[4909]: E1201 10:32:16.257536 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:16 crc kubenswrapper[4909]: E1201 10:32:16.257908 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.356561 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.356622 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.356639 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.356667 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.356687 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.460268 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.460359 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.460386 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.460419 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.460441 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.564563 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.564637 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.564665 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.564699 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.564724 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.619712 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.620060 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.620115 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.620148 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.620180 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4909]: E1201 10:32:16.637675 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.643126 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.643178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.643190 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.643209 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.643222 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4909]: E1201 10:32:16.661694 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.667351 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.667433 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.667460 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.667498 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.667524 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4909]: E1201 10:32:16.685982 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.692031 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.692089 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.692098 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.692118 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.692129 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4909]: E1201 10:32:16.713505 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.719004 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.719053 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.719062 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.719086 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.719099 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4909]: E1201 10:32:16.737217 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:16Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:16 crc kubenswrapper[4909]: E1201 10:32:16.737572 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.739519 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.739580 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.739592 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.739610 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.739621 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.843095 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.843373 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.843435 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.843502 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.843559 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.947029 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.947314 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.947448 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.947544 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:16 crc kubenswrapper[4909]: I1201 10:32:16.947637 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:16Z","lastTransitionTime":"2025-12-01T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.050737 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.050803 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.050820 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.050845 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.050863 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:17Z","lastTransitionTime":"2025-12-01T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.154224 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.154295 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.154319 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.154350 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.154371 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:17Z","lastTransitionTime":"2025-12-01T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.256420 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:17 crc kubenswrapper[4909]: E1201 10:32:17.256683 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.258226 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.258351 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.258376 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.258409 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.258432 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:17Z","lastTransitionTime":"2025-12-01T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.360990 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.361028 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.361041 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.361058 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.361068 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:17Z","lastTransitionTime":"2025-12-01T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.463616 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.463671 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.463680 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.463700 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.463712 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:17Z","lastTransitionTime":"2025-12-01T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.566314 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.566352 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.566360 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.566375 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.566384 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:17Z","lastTransitionTime":"2025-12-01T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.668939 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.668986 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.669000 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.669019 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.669030 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:17Z","lastTransitionTime":"2025-12-01T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.771425 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.771566 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.771594 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.771633 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.771661 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:17Z","lastTransitionTime":"2025-12-01T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.874813 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.874910 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.874927 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.874953 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.874970 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:17Z","lastTransitionTime":"2025-12-01T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.977729 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.977794 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.977813 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.977864 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:17 crc kubenswrapper[4909]: I1201 10:32:17.977914 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:17Z","lastTransitionTime":"2025-12-01T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.081116 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.081153 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.081162 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.081179 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.081191 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:18Z","lastTransitionTime":"2025-12-01T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.184086 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.184161 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.184175 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.184208 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.184222 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:18Z","lastTransitionTime":"2025-12-01T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.256246 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.256302 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.256297 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:18 crc kubenswrapper[4909]: E1201 10:32:18.257599 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:18 crc kubenswrapper[4909]: E1201 10:32:18.257750 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:18 crc kubenswrapper[4909]: E1201 10:32:18.257927 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.287531 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.287627 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.287655 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.287695 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.287725 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:18Z","lastTransitionTime":"2025-12-01T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.391201 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.391279 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.391298 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.391336 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.391382 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:18Z","lastTransitionTime":"2025-12-01T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.495192 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.495258 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.495269 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.495289 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.495307 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:18Z","lastTransitionTime":"2025-12-01T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.598379 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.598440 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.598454 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.598478 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.598492 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:18Z","lastTransitionTime":"2025-12-01T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.701405 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.701442 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.701454 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.701473 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.701485 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:18Z","lastTransitionTime":"2025-12-01T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.805016 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.805518 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.805683 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.805847 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.806037 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:18Z","lastTransitionTime":"2025-12-01T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.910610 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.910657 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.910673 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.910694 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:18 crc kubenswrapper[4909]: I1201 10:32:18.910707 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:18Z","lastTransitionTime":"2025-12-01T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.014780 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.014849 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.014868 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.014928 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.014947 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:19Z","lastTransitionTime":"2025-12-01T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.118011 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.118065 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.118078 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.118101 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.118117 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:19Z","lastTransitionTime":"2025-12-01T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.221445 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.221515 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.221525 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.221546 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.221558 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:19Z","lastTransitionTime":"2025-12-01T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.256474 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:19 crc kubenswrapper[4909]: E1201 10:32:19.256767 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.325004 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.325049 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.325058 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.325076 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.325089 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:19Z","lastTransitionTime":"2025-12-01T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.428023 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.428410 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.428484 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.428553 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.428631 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:19Z","lastTransitionTime":"2025-12-01T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.531621 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.531666 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.531677 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.531696 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.531705 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:19Z","lastTransitionTime":"2025-12-01T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.634238 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.634286 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.634297 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.634344 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.634358 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:19Z","lastTransitionTime":"2025-12-01T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.737402 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.737480 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.737501 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.737532 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.737557 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:19Z","lastTransitionTime":"2025-12-01T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.841995 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.842470 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.842672 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.842967 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.843216 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:19Z","lastTransitionTime":"2025-12-01T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.947152 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.947720 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.948113 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.948309 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:19 crc kubenswrapper[4909]: I1201 10:32:19.948445 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:19Z","lastTransitionTime":"2025-12-01T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.051918 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.052376 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.052549 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.052665 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.052763 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:20Z","lastTransitionTime":"2025-12-01T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.156416 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.156461 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.156471 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.156493 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.156503 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:20Z","lastTransitionTime":"2025-12-01T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.257141 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.257280 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.257178 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:20 crc kubenswrapper[4909]: E1201 10:32:20.257463 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:20 crc kubenswrapper[4909]: E1201 10:32:20.257828 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:20 crc kubenswrapper[4909]: E1201 10:32:20.258064 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.259830 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.259890 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.259904 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.259923 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.259936 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:20Z","lastTransitionTime":"2025-12-01T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.363624 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.363929 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.364028 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.364147 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.364228 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:20Z","lastTransitionTime":"2025-12-01T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.467109 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.467383 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.467478 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.467574 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.467638 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:20Z","lastTransitionTime":"2025-12-01T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.572135 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.572479 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.572553 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.572634 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.572699 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:20Z","lastTransitionTime":"2025-12-01T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.675679 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.675735 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.675752 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.675775 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.675794 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:20Z","lastTransitionTime":"2025-12-01T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.778967 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.779031 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.779047 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.779071 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.779089 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:20Z","lastTransitionTime":"2025-12-01T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.881947 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.882027 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.882046 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.882072 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.882089 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:20Z","lastTransitionTime":"2025-12-01T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.985329 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.985401 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.985416 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.985440 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:20 crc kubenswrapper[4909]: I1201 10:32:20.985465 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:20Z","lastTransitionTime":"2025-12-01T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.088676 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.088741 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.088759 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.088785 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.088807 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:21Z","lastTransitionTime":"2025-12-01T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.192758 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.192824 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.192844 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.192898 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.192916 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:21Z","lastTransitionTime":"2025-12-01T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.256762 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:21 crc kubenswrapper[4909]: E1201 10:32:21.257139 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.296284 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.296858 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.297018 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.297105 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.297170 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:21Z","lastTransitionTime":"2025-12-01T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.401706 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.401786 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.401803 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.401843 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.401863 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:21Z","lastTransitionTime":"2025-12-01T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.504718 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.505066 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.505136 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.505204 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.505263 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:21Z","lastTransitionTime":"2025-12-01T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.609937 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.610009 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.610027 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.610052 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.610071 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:21Z","lastTransitionTime":"2025-12-01T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.712271 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.712319 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.712328 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.712345 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.712354 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:21Z","lastTransitionTime":"2025-12-01T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.815105 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.815172 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.815186 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.815216 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.815233 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:21Z","lastTransitionTime":"2025-12-01T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.918086 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.918521 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.918723 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.918974 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:21 crc kubenswrapper[4909]: I1201 10:32:21.919181 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:21Z","lastTransitionTime":"2025-12-01T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.023235 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.023763 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.023942 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.024106 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.024444 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:22Z","lastTransitionTime":"2025-12-01T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.128063 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.128164 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.128182 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.128211 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.128230 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:22Z","lastTransitionTime":"2025-12-01T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.231516 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.231577 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.231587 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.231609 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.231619 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:22Z","lastTransitionTime":"2025-12-01T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.257380 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.257512 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.257380 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:22 crc kubenswrapper[4909]: E1201 10:32:22.257581 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:22 crc kubenswrapper[4909]: E1201 10:32:22.257747 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:22 crc kubenswrapper[4909]: E1201 10:32:22.258008 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.335259 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.335373 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.335391 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.335424 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.335446 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:22Z","lastTransitionTime":"2025-12-01T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.438688 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.438730 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.438767 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.438787 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.438800 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:22Z","lastTransitionTime":"2025-12-01T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.542087 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.542138 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.542151 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.542169 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.542186 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:22Z","lastTransitionTime":"2025-12-01T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.645570 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.645620 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.645636 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.645658 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.645676 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:22Z","lastTransitionTime":"2025-12-01T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.748348 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.748398 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.748412 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.748430 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.748442 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:22Z","lastTransitionTime":"2025-12-01T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.852001 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.852074 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.852086 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.852104 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.852115 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:22Z","lastTransitionTime":"2025-12-01T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.956019 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.956556 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.956731 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.956933 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:22 crc kubenswrapper[4909]: I1201 10:32:22.957132 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:22Z","lastTransitionTime":"2025-12-01T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.061100 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.061224 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.061250 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.061280 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.061301 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:23Z","lastTransitionTime":"2025-12-01T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.164988 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.165065 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.165088 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.165119 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.165145 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:23Z","lastTransitionTime":"2025-12-01T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.256760 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:23 crc kubenswrapper[4909]: E1201 10:32:23.257025 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.257937 4909 scope.go:117] "RemoveContainer" containerID="a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9" Dec 01 10:32:23 crc kubenswrapper[4909]: E1201 10:32:23.258188 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.268600 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.268662 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.268680 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.268698 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.268722 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:23Z","lastTransitionTime":"2025-12-01T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.280859 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.298794 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z48j9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca0394a-c980-4220-ab44-d2f55519cb1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:57Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z48j9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:23 crc 
kubenswrapper[4909]: I1201 10:32:23.317932 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.352514 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.370077 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:
24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.371456 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.371513 
4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.371527 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.371551 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.371572 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:23Z","lastTransitionTime":"2025-12-01T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.384314 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.399662 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.423120 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:09Z\\\",\\\"message\\\":\\\"sion-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 10:32:09.164050 6569 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/no\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:32:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6
426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.438446 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb9
07da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.457246 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd4846
57d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.472163 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:32:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.475967 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.476005 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.476018 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.476038 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.476052 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:23Z","lastTransitionTime":"2025-12-01T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.490229 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:23Z 
is after 2025-08-24T17:21:41Z" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.503103 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.515768 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caaa6b24-fb98-4908-b7a7-929c44181c99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc679abce825df
1b24d356105ee755209df36f9bbaf961f7a448e30a7561b24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc42c2cf8c051b5eaf0d65edea6f2fccc58ef6a798030d1e16ab714d57916f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8dv5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.527181 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00ec53d-b04b-4ac4-b626-c0582bda7471\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5daaa83677d79853eb2fee9d9c23a3b0cdc605ed7cbdc9035398272dec901f33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25971
26bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df23df31cee6a5831601de7b58d6e70b19456b9e410df2b2061be651927a1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ab7863aa3997f849fe685ae36085186463a760ab10187231605eb4a1bc181b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://399838e4442f6b2e73986a31a47c30cc1fbbe45693a3f304f6b72f6f210565c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399838e4442f6b2e73986a31a47c30cc1fbbe45693a3f304f6b72f6f210565c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.537987 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.547898 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.560779 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.579024 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.579083 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.579104 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.579128 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.579144 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:23Z","lastTransitionTime":"2025-12-01T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.682113 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.682154 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.682164 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.682181 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.682191 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:23Z","lastTransitionTime":"2025-12-01T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.785348 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.785930 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.786096 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.786242 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.786375 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:23Z","lastTransitionTime":"2025-12-01T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.889632 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.889677 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.889690 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.889709 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.889720 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:23Z","lastTransitionTime":"2025-12-01T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.992505 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.992553 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.992563 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.992577 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:23 crc kubenswrapper[4909]: I1201 10:32:23.992587 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:23Z","lastTransitionTime":"2025-12-01T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.095153 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.095216 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.095227 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.095249 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.095261 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:24Z","lastTransitionTime":"2025-12-01T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.198917 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.198998 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.199017 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.199049 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.199069 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:24Z","lastTransitionTime":"2025-12-01T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.256429 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.256429 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.256435 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:24 crc kubenswrapper[4909]: E1201 10:32:24.256797 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:24 crc kubenswrapper[4909]: E1201 10:32:24.256619 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:24 crc kubenswrapper[4909]: E1201 10:32:24.256839 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.302578 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.302625 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.302636 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.302652 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.302664 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:24Z","lastTransitionTime":"2025-12-01T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.405666 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.405724 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.405735 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.405751 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.405764 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:24Z","lastTransitionTime":"2025-12-01T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.508807 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.508925 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.508943 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.508973 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.508994 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:24Z","lastTransitionTime":"2025-12-01T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.612383 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.612471 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.612495 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.612528 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.612550 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:24Z","lastTransitionTime":"2025-12-01T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.716288 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.716357 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.716377 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.716408 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.716432 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:24Z","lastTransitionTime":"2025-12-01T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.820385 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.820459 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.820480 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.820511 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.820535 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:24Z","lastTransitionTime":"2025-12-01T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.923365 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.923436 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.923455 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.923489 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:24 crc kubenswrapper[4909]: I1201 10:32:24.923510 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:24Z","lastTransitionTime":"2025-12-01T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.025970 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.026012 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.026025 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.026044 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.026055 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:25Z","lastTransitionTime":"2025-12-01T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.128546 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.128669 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.128689 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.128717 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.128742 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:25Z","lastTransitionTime":"2025-12-01T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.231915 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.231953 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.231992 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.232012 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.232024 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:25Z","lastTransitionTime":"2025-12-01T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.257045 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:25 crc kubenswrapper[4909]: E1201 10:32:25.257300 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.334803 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.334934 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.334969 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.335005 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.335030 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:25Z","lastTransitionTime":"2025-12-01T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.438814 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.438940 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.438967 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.439001 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.439025 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:25Z","lastTransitionTime":"2025-12-01T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.543109 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.543178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.543193 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.543215 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.543229 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:25Z","lastTransitionTime":"2025-12-01T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.646094 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.646164 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.646186 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.646217 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.646237 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:25Z","lastTransitionTime":"2025-12-01T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.749494 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.749537 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.749576 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.749596 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.749630 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:25Z","lastTransitionTime":"2025-12-01T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.852689 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.852762 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.852783 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.852815 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.852844 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:25Z","lastTransitionTime":"2025-12-01T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.954786 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.954824 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.954834 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.954847 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:25 crc kubenswrapper[4909]: I1201 10:32:25.954856 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:25Z","lastTransitionTime":"2025-12-01T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.057864 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.057945 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.057956 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.057978 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.057995 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.161373 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.161460 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.161478 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.161504 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.161530 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.256885 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.256980 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.257056 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:26 crc kubenswrapper[4909]: E1201 10:32:26.257049 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:26 crc kubenswrapper[4909]: E1201 10:32:26.257149 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:26 crc kubenswrapper[4909]: E1201 10:32:26.257414 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.264782 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.264828 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.264836 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.264854 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.264864 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.368296 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.368351 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.368367 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.368390 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.368408 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.472574 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.472675 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.472690 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.472713 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.472730 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.578376 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.578462 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.578479 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.578503 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.578525 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.682740 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.682802 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.682815 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.682837 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.682849 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.786623 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.786699 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.786711 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.786731 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.786745 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.890371 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.890449 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.890474 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.890506 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.890530 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.994216 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.994487 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.994573 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.994642 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:26 crc kubenswrapper[4909]: I1201 10:32:26.994700 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:26Z","lastTransitionTime":"2025-12-01T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.022544 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.022846 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.023017 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.023129 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.023248 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4909]: E1201 10:32:27.043993 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:27Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.049717 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.049745 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.049756 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.049773 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.049785 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4909]: E1201 10:32:27.072349 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:27Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.076240 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.076270 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.076283 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.076301 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.076312 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4909]: E1201 10:32:27.089086 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:27Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.092206 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.092221 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.092229 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.092245 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.092257 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4909]: E1201 10:32:27.106718 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:27Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.111278 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.111336 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.111346 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.111369 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.111383 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4909]: E1201 10:32:27.126398 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:27Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:27 crc kubenswrapper[4909]: E1201 10:32:27.126526 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.128704 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.128737 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.128748 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.128767 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.128780 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.231634 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.231676 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.231684 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.231699 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.231707 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.256478 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:27 crc kubenswrapper[4909]: E1201 10:32:27.256618 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.335993 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.336045 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.336054 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.336073 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.336087 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.439258 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.439305 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.439315 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.439334 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.439345 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.542330 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.542420 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.542445 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.542476 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.542496 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.645324 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.645863 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.645893 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.645916 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.645931 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.749606 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.749678 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.749689 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.749708 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.749718 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.852301 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.852358 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.852368 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.852381 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.852392 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.955426 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.955498 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.955506 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.955522 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:27 crc kubenswrapper[4909]: I1201 10:32:27.955532 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:27Z","lastTransitionTime":"2025-12-01T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.064989 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.065037 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.065047 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.065064 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.065073 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:28Z","lastTransitionTime":"2025-12-01T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.167115 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.167151 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.167161 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.167175 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.167184 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:28Z","lastTransitionTime":"2025-12-01T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.256466 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:28 crc kubenswrapper[4909]: E1201 10:32:28.256619 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.256689 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:28 crc kubenswrapper[4909]: E1201 10:32:28.256733 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.257159 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:28 crc kubenswrapper[4909]: E1201 10:32:28.257378 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.269965 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.270016 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.270026 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.270052 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.270063 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:28Z","lastTransitionTime":"2025-12-01T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.373219 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.373258 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.373267 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.373282 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.373294 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:28Z","lastTransitionTime":"2025-12-01T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.477195 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.477257 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.477276 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.477305 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.477324 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:28Z","lastTransitionTime":"2025-12-01T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.579631 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.579674 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.579685 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.579702 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.579714 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:28Z","lastTransitionTime":"2025-12-01T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.682233 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.682274 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.682284 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.682303 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.682312 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:28Z","lastTransitionTime":"2025-12-01T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.784544 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.784576 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.784584 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.784597 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.784608 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:28Z","lastTransitionTime":"2025-12-01T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.886925 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.886969 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.886981 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.887003 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.887016 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:28Z","lastTransitionTime":"2025-12-01T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.989772 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.989848 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.989859 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.989896 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:28 crc kubenswrapper[4909]: I1201 10:32:28.989909 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:28Z","lastTransitionTime":"2025-12-01T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.092382 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.092428 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.092438 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.092453 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.092464 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:29Z","lastTransitionTime":"2025-12-01T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.196325 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.196395 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.196409 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.196433 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.196446 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:29Z","lastTransitionTime":"2025-12-01T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.257219 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:29 crc kubenswrapper[4909]: E1201 10:32:29.257445 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.285195 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs\") pod \"network-metrics-daemon-z48j9\" (UID: \"dca0394a-c980-4220-ab44-d2f55519cb1a\") " pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:29 crc kubenswrapper[4909]: E1201 10:32:29.285406 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:32:29 crc kubenswrapper[4909]: E1201 10:32:29.285480 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs podName:dca0394a-c980-4220-ab44-d2f55519cb1a nodeName:}" failed. No retries permitted until 2025-12-01 10:33:01.285458712 +0000 UTC m=+98.519929610 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs") pod "network-metrics-daemon-z48j9" (UID: "dca0394a-c980-4220-ab44-d2f55519cb1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.298844 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.298895 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.298906 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.298924 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.298937 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:29Z","lastTransitionTime":"2025-12-01T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.402665 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.402730 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.402742 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.402768 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.402782 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:29Z","lastTransitionTime":"2025-12-01T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.506223 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.506277 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.506286 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.506305 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.506316 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:29Z","lastTransitionTime":"2025-12-01T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.608942 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.608988 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.608999 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.609018 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.609030 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:29Z","lastTransitionTime":"2025-12-01T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.714225 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.714297 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.714317 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.714347 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.714365 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:29Z","lastTransitionTime":"2025-12-01T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.816918 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.816974 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.816992 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.817017 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.817031 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:29Z","lastTransitionTime":"2025-12-01T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.921226 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.921273 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.921281 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.921299 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:29 crc kubenswrapper[4909]: I1201 10:32:29.921309 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:29Z","lastTransitionTime":"2025-12-01T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.024736 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.024806 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.024820 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.024846 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.024857 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:30Z","lastTransitionTime":"2025-12-01T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.127942 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.127991 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.127999 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.128015 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.128026 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:30Z","lastTransitionTime":"2025-12-01T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.230836 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.230922 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.230935 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.230956 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.230968 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:30Z","lastTransitionTime":"2025-12-01T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.256925 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.257010 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.257063 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:30 crc kubenswrapper[4909]: E1201 10:32:30.257032 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:30 crc kubenswrapper[4909]: E1201 10:32:30.257180 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:30 crc kubenswrapper[4909]: E1201 10:32:30.257287 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.332985 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.333074 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.333089 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.333143 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.333157 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:30Z","lastTransitionTime":"2025-12-01T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.435635 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.435692 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.435711 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.435736 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.435754 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:30Z","lastTransitionTime":"2025-12-01T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.538271 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.538319 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.538331 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.538348 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.538359 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:30Z","lastTransitionTime":"2025-12-01T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.640715 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.640802 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.640821 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.640853 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.640924 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:30Z","lastTransitionTime":"2025-12-01T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.700947 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2qpdc_89f06a94-5047-41d9-90a3-8433149d22c4/kube-multus/0.log" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.700995 4909 generic.go:334] "Generic (PLEG): container finished" podID="89f06a94-5047-41d9-90a3-8433149d22c4" containerID="74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784" exitCode=1 Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.701024 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2qpdc" event={"ID":"89f06a94-5047-41d9-90a3-8433149d22c4","Type":"ContainerDied","Data":"74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784"} Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.701400 4909 scope.go:117] "RemoveContainer" containerID="74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.718587 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.737838 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.745646 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.745719 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.745737 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.745786 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.745804 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:30Z","lastTransitionTime":"2025-12-01T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.751803 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.766858 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z48j9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca0394a-c980-4220-ab44-d2f55519cb1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:57Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z48j9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:30 crc 
kubenswrapper[4909]: I1201 10:32:30.785515 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.806503 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d4
5f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.825934 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.841483 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.848403 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.848437 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.848473 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.848491 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.848503 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:30Z","lastTransitionTime":"2025-12-01T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.855264 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.881008 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:09Z\\\",\\\"message\\\":\\\"sion-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 10:32:09.164050 6569 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/no\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:32:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6
426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.930626 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb9
07da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.951192 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.951242 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.951252 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:30 crc 
kubenswrapper[4909]: I1201 10:32:30.951272 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.951286 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:30Z","lastTransitionTime":"2025-12-01T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.971404 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.983822 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:32:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:30 crc kubenswrapper[4909]: I1201 10:32:30.996631 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:29Z\\\",\\\"message\\\":\\\"2025-12-01T10:31:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_00c808ab-b226-48b0-a049-ed55cc130f78\\\\n2025-12-01T10:31:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_00c808ab-b226-48b0-a049-ed55cc130f78 to /host/opt/cni/bin/\\\\n2025-12-01T10:31:44Z [verbose] multus-daemon started\\\\n2025-12-01T10:31:44Z [verbose] Readiness Indicator file check\\\\n2025-12-01T10:32:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:30Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.006222 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.017123 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caaa6b24-fb98-4908-b7a7-929c44181c99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc679abce825df1b24d356105ee755209df36f9bbaf961f7a448e30a7561b24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc42c2cf8c051b5eaf0d65edea6f2fccc58ef
6a798030d1e16ab714d57916f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8dv5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.028027 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00ec53d-b04b-4ac4-b626-c0582bda7471\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5daaa83677d79853eb2fee9d9c23a3b0cdc605ed7cbdc9035398272dec901f33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df23df31cee6a5831601de7b58d6e70b19456b9e410df2b2061be651927a1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ab7863aa3997f849fe685ae36085186463a760ab10187231605eb4a1bc181b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://399838e4442f6b2e73986a31a47c30cc1fbbe45693a3f304f6b72f6f210565c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://399838e4442f6b2e73986a31a47c30cc1fbbe45693a3f304f6b72f6f210565c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.041126 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd4846
57d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.053894 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.053921 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.053931 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.053948 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.053961 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:31Z","lastTransitionTime":"2025-12-01T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.157011 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.157058 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.157067 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.157086 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.157097 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:31Z","lastTransitionTime":"2025-12-01T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.256795 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:31 crc kubenswrapper[4909]: E1201 10:32:31.257056 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.260066 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.260114 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.260132 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.260152 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.260167 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:31Z","lastTransitionTime":"2025-12-01T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.362801 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.362865 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.362903 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.362925 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.362948 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:31Z","lastTransitionTime":"2025-12-01T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.466108 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.466150 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.466163 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.466180 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.466192 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:31Z","lastTransitionTime":"2025-12-01T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.568524 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.568613 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.568637 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.568673 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.568698 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:31Z","lastTransitionTime":"2025-12-01T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.671727 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.671779 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.671795 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.671815 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.671827 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:31Z","lastTransitionTime":"2025-12-01T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.707048 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2qpdc_89f06a94-5047-41d9-90a3-8433149d22c4/kube-multus/0.log" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.707119 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2qpdc" event={"ID":"89f06a94-5047-41d9-90a3-8433149d22c4","Type":"ContainerStarted","Data":"73cbec22bbb541e1899f2414143c3c295a3824da919403f4bf9d7a3d2f7e49a5"} Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.729228 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cbec22bbb541e1899f2414143c3c295a3824da919403f4bf9d7a3d2f7e49a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:29Z\\\",\\\"message\\\":\\\"2025-12-01T10:31:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_00c808ab-b226-48b0-a049-ed55cc130f78\\\\n2025-12-01T10:31:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_00c808ab-b226-48b0-a049-ed55cc130f78 to /host/opt/cni/bin/\\\\n2025-12-01T10:31:44Z [verbose] multus-daemon started\\\\n2025-12-01T10:31:44Z [verbose] Readiness Indicator file check\\\\n2025-12-01T10:32:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\
",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.744455 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.757957 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caaa6b24-fb98-4908-b7a7-929c44181c99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc679abce825df1b24d356105ee755209df36f9bbaf961f7a448e30a7561b24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc42c2cf8c051b5eaf0d65edea6f2fccc58ef6a798030d1e16ab714d57916f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8dv5p\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.774952 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00ec53d-b04b-4ac4-b626-c0582bda7471\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5daaa83677d79853eb2fee9d9c23a3b0cdc605ed7cbdc9035398272dec901f33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df23df31cee6a5831601de7b58d6e70b19456b9e410df2b2061be651927a1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ab7863aa3997f849fe685ae36085186463a760ab10187231605eb4a1bc181b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39
9838e4442f6b2e73986a31a47c30cc1fbbe45693a3f304f6b72f6f210565c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399838e4442f6b2e73986a31a47c30cc1fbbe45693a3f304f6b72f6f210565c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.775146 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.775218 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.775237 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.775604 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.775820 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:31Z","lastTransitionTime":"2025-12-01T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.793928 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resou
rce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T
10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.811145 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.829497 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.848101 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.864494 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.878808 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.878842 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.878853 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.878893 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.878905 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:31Z","lastTransitionTime":"2025-12-01T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.882741 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.901358 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d4
5f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.917631 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z48j9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca0394a-c980-4220-ab44-d2f55519cb1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:57Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z48j9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc 
kubenswrapper[4909]: I1201 10:32:31.935894 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.953197 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.977497 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:09Z\\\",\\\"message\\\":\\\"sion-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 10:32:09.164050 6569 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/no\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:32:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6
426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.981817 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.981857 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.981889 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.981916 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.981927 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:31Z","lastTransitionTime":"2025-12-01T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:31 crc kubenswrapper[4909]: I1201 10:32:31.992466 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-01T10:32:31Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.017284 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setu
p\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:32Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.032732 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:32Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.084441 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.084502 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.084518 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.084545 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.084560 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:32Z","lastTransitionTime":"2025-12-01T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.187688 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.187739 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.187754 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.187782 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.187794 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:32Z","lastTransitionTime":"2025-12-01T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.257176 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.257214 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.257204 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:32 crc kubenswrapper[4909]: E1201 10:32:32.257328 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:32 crc kubenswrapper[4909]: E1201 10:32:32.257420 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:32 crc kubenswrapper[4909]: E1201 10:32:32.257482 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.269332 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.290346 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.290404 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.290420 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.290445 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.290459 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:32Z","lastTransitionTime":"2025-12-01T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.393780 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.393832 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.393846 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.393886 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.393901 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:32Z","lastTransitionTime":"2025-12-01T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.497526 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.497574 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.497584 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.497601 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.497612 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:32Z","lastTransitionTime":"2025-12-01T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.601324 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.601375 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.601385 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.601404 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.601415 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:32Z","lastTransitionTime":"2025-12-01T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.704162 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.704199 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.704208 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.704222 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.704230 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:32Z","lastTransitionTime":"2025-12-01T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.806614 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.806680 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.806694 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.806724 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.806741 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:32Z","lastTransitionTime":"2025-12-01T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.910208 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.910323 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.910342 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.910404 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:32 crc kubenswrapper[4909]: I1201 10:32:32.910424 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:32Z","lastTransitionTime":"2025-12-01T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.013050 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.013131 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.013151 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.013184 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.013207 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:33Z","lastTransitionTime":"2025-12-01T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.115425 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.115474 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.115485 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.115500 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.115512 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:33Z","lastTransitionTime":"2025-12-01T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.218176 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.218246 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.218261 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.218283 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.218300 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:33Z","lastTransitionTime":"2025-12-01T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.257276 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:33 crc kubenswrapper[4909]: E1201 10:32:33.257501 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.274965 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00ec53d-b04b-4ac4-b626-c0582bda7471\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5daaa83677d79853eb2fee9d9c23a3b0cdc605ed7cbdc9035398272dec901f33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://7df23df31cee6a5831601de7b58d6e70b19456b9e410df2b2061be651927a1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ab7863aa3997f849fe685ae36085186463a760ab10187231605eb4a1bc181b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://399838e4442f6b2e73986a31a47c30cc1fbbe45693a3f304f6b72f6f210565c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399838e4442f6b2e73986a31a47c30cc1fbbe45693a3f304f6b72f6f210565c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.300839 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd4846
57d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.320138 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:32:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.322417 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.322564 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.322656 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.322724 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.322810 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:33Z","lastTransitionTime":"2025-12-01T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.341796 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cbec22bbb541e1899f2414143c3c295a3824da919403f4bf9d7a3d2f7e49a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:29Z\\\",\\\"message\\\":\\\"2025-12-01T10:31:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_00c808ab-b226-48b0-a049-ed55cc130f78\\\\n2025-12-01T10:31:44+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_00c808ab-b226-48b0-a049-ed55cc130f78 to /host/opt/cni/bin/\\\\n2025-12-01T10:31:44Z [verbose] multus-daemon started\\\\n2025-12-01T10:31:44Z [verbose] Readiness Indicator file check\\\\n2025-12-01T10:32:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.365220 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.379417 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caaa6b24-fb98-4908-b7a7-929c44181c99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc679abce825df1b24d356105ee755209df36f9bbaf961f7a448e30a7561b24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc42c2cf8c051b5eaf0d65edea6f2fccc58ef6a798030d1e16ab714d57916f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8dv5p\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.392288 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.403960 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.414444 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.426194 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.426228 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.426237 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.426255 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.426264 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:33Z","lastTransitionTime":"2025-12-01T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.430743 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.449102 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d4
5f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.461502 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z48j9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca0394a-c980-4220-ab44-d2f55519cb1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:57Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z48j9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:33 crc 
kubenswrapper[4909]: I1201 10:32:33.471292 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8daea63e-f4d3-4786-8917-e1a93eee0df8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37f14edf70404ba216b5dc2e1aaad7f289144fd1d6361148cd7a93232140469c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://9317725fa67399cc2be4bca84c2bbf5d3dd611420ebc76c9a995a4f2dac6d010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9317725fa67399cc2be4bca84c2bbf5d3dd611420ebc76c9a995a4f2dac6d010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.492463 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.503889 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:
24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.518493 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.530034 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.530081 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.530090 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.530109 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.530121 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:33Z","lastTransitionTime":"2025-12-01T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.530362 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.548516 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:09Z\\\",\\\"message\\\":\\\"sion-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 10:32:09.164050 6569 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/no\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:32:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6
426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.561535 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb9
07da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.632853 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.632920 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.632933 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:33 crc 
kubenswrapper[4909]: I1201 10:32:33.632953 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.632964 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:33Z","lastTransitionTime":"2025-12-01T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.735489 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.735535 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.735543 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.735559 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.735568 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:33Z","lastTransitionTime":"2025-12-01T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.845475 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.845549 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.845561 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.845579 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.845596 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:33Z","lastTransitionTime":"2025-12-01T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.948991 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.949286 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.949386 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.949529 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:33 crc kubenswrapper[4909]: I1201 10:32:33.949631 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:33Z","lastTransitionTime":"2025-12-01T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.052786 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.052847 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.052864 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.052920 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.052942 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:34Z","lastTransitionTime":"2025-12-01T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.155973 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.156290 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.156378 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.156503 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.156584 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:34Z","lastTransitionTime":"2025-12-01T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.256244 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:34 crc kubenswrapper[4909]: E1201 10:32:34.256367 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.256396 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.256505 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:34 crc kubenswrapper[4909]: E1201 10:32:34.256618 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:34 crc kubenswrapper[4909]: E1201 10:32:34.256741 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.258483 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.258513 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.258524 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.258538 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.258551 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:34Z","lastTransitionTime":"2025-12-01T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.361419 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.361464 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.361473 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.361488 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.361498 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:34Z","lastTransitionTime":"2025-12-01T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.463935 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.463972 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.463982 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.464041 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.464052 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:34Z","lastTransitionTime":"2025-12-01T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.567130 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.567179 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.567194 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.567214 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.567229 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:34Z","lastTransitionTime":"2025-12-01T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.669938 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.669997 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.670009 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.670025 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.670052 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:34Z","lastTransitionTime":"2025-12-01T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.772321 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.772360 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.772370 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.772385 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.772396 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:34Z","lastTransitionTime":"2025-12-01T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.875173 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.875209 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.875217 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.875232 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.875242 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:34Z","lastTransitionTime":"2025-12-01T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.977325 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.977367 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.977377 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.977391 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:34 crc kubenswrapper[4909]: I1201 10:32:34.977403 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:34Z","lastTransitionTime":"2025-12-01T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.079823 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.079885 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.079898 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.079914 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.079924 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:35Z","lastTransitionTime":"2025-12-01T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.183004 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.183049 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.183059 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.183074 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.183085 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:35Z","lastTransitionTime":"2025-12-01T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.257086 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:35 crc kubenswrapper[4909]: E1201 10:32:35.257256 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.285148 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.285193 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.285206 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.285226 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.285238 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:35Z","lastTransitionTime":"2025-12-01T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.388152 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.388222 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.388237 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.388261 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.388276 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:35Z","lastTransitionTime":"2025-12-01T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.491679 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.491759 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.491777 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.492492 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.492637 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:35Z","lastTransitionTime":"2025-12-01T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.595696 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.595740 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.595751 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.595768 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.595779 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:35Z","lastTransitionTime":"2025-12-01T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.698191 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.698250 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.698260 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.698276 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.698288 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:35Z","lastTransitionTime":"2025-12-01T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.801152 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.801921 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.802010 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.802104 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.802187 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:35Z","lastTransitionTime":"2025-12-01T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.904602 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.904650 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.904659 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.904677 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:35 crc kubenswrapper[4909]: I1201 10:32:35.904686 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:35Z","lastTransitionTime":"2025-12-01T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.006751 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.006799 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.006809 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.006827 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.006842 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.109538 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.109582 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.109591 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.109607 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.109616 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.212462 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.212520 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.212533 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.212552 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.212563 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.256559 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:36 crc kubenswrapper[4909]: E1201 10:32:36.256727 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.256967 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:36 crc kubenswrapper[4909]: E1201 10:32:36.257020 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.256565 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:36 crc kubenswrapper[4909]: E1201 10:32:36.257156 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.257458 4909 scope.go:117] "RemoveContainer" containerID="a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.315114 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.315168 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.315176 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.315190 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.315201 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.418056 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.418108 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.418119 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.418134 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.418156 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.520921 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.521229 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.521319 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.521407 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.521524 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.623422 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.623449 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.623459 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.623474 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.623484 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.722860 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5rks_57aeccf3-ec18-4a73-bd74-9b188de510ad/ovnkube-controller/2.log" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.725240 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.725269 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.725280 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.725295 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.725308 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.726554 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerStarted","Data":"c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef"} Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.727036 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.742644 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z48j9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca0394a-c980-4220-ab44-d2f55519cb1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:57Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z48j9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc 
kubenswrapper[4909]: I1201 10:32:36.756789 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.773917 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d4
5f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.790839 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.810806 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.828450 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.828494 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.828506 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.828524 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.828533 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.831183 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.852048 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:09Z\\\",\\\"message\\\":\\\"sion-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 10:32:09.164050 6569 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/no\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:32:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:32:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/
ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.865557 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb9
07da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.876588 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8daea63e-f4d3-4786-8917-e1a93eee0df8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37f14edf70404ba216b5dc2e1aaad7f289144fd1d6361148cd7a93232140469c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9317725fa67399cc2be4bca84c2bbf5d3dd611420ebc76c9a995a4f2dac6d010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9317725fa67399cc2be4bca84c2bbf5d3dd611420ebc76c9a995a4f2dac6d010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.895200 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.906380 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.918230 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cbec22bbb541e1899f2414143c3c295a3824da919403f4bf9d7a3d2f7e49a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:29Z\\\",\\\"message\\\":\\\"2025-12-01T10:31:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_00c808ab-b226-48b0-a049-ed55cc130f78\\\\n2025-12-01T10:31:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_00c808ab-b226-48b0-a049-ed55cc130f78 to /host/opt/cni/bin/\\\\n2025-12-01T10:31:44Z [verbose] multus-daemon started\\\\n2025-12-01T10:31:44Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T10:32:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.929493 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b
6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.930853 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.930892 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.930902 4909 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.930917 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.930928 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:36Z","lastTransitionTime":"2025-12-01T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.940276 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caaa6b24-fb98-4908-b7a7-929c44181c99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc679abce825df1b24d356105ee755209df36f9bbaf961f7a448e30a7561b24\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc42c2cf8c051b5eaf0d65edea6f2fccc58ef6a798030d1e16ab714d57916f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\
\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8dv5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.951075 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00ec53d-b04b-4ac4-b626-c0582bda7471\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5daaa83677d79853eb2fee9d9c23a3b0cdc605ed7cbdc9035398272dec901f33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df23df31cee6a5831601de7b58d6e70b19456b9e410df2b2061be651927a1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ab7863aa3997f849fe685ae36085186463a760ab10187231605eb4a1bc181b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://399838e4442f6b2e73986a31a47c30cc1fbbe45693a3f304f6b72f6f210565c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399838e4442f6b2e73986a31a47c30cc1fbbe45693a3f304f6b72f6f210565c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.964652 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd4846
57d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.976282 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.988610 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:36 crc kubenswrapper[4909]: I1201 10:32:36.997964 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:36Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.033775 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.033814 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.033824 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.033841 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.033853 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.136511 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.136547 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.136555 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.136569 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.136577 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.238612 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.238646 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.238655 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.238671 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.238681 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.258308 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:37 crc kubenswrapper[4909]: E1201 10:32:37.258439 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.340732 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.340765 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.340773 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.340787 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.340816 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.443921 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.443986 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.444006 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.444028 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.444041 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.526646 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.526678 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.526689 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.526702 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.526711 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4909]: E1201 10:32:37.541227 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.545526 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.545560 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.545572 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.545593 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.545605 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4909]: E1201 10:32:37.560868 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.565643 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.565701 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.565714 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.565733 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.565745 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4909]: E1201 10:32:37.581760 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.589619 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.589687 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.589704 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.589730 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.589745 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4909]: E1201 10:32:37.605578 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.609512 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.609552 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.609568 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.609590 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.609607 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4909]: E1201 10:32:37.625342 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: E1201 10:32:37.625516 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.627340 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.627408 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.627426 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.627447 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.627462 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.729405 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.729442 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.729453 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.729470 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.729480 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.732220 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5rks_57aeccf3-ec18-4a73-bd74-9b188de510ad/ovnkube-controller/3.log" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.732915 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5rks_57aeccf3-ec18-4a73-bd74-9b188de510ad/ovnkube-controller/2.log" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.736269 4909 generic.go:334] "Generic (PLEG): container finished" podID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerID="c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef" exitCode=1 Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.736335 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerDied","Data":"c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef"} Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.736389 4909 scope.go:117] "RemoveContainer" containerID="a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.737022 4909 scope.go:117] "RemoveContainer" containerID="c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef" Dec 01 10:32:37 crc kubenswrapper[4909]: E1201 10:32:37.737247 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.753400 4909 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caaa6b24-fb98-4908-b7a7-929c44181c99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc679abce825df1b24d356105ee755209df36f9bbaf961f7a448e30a7561b24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc42c2cf8c051b5eaf0d65edea6f2fccc58ef6a798030d1e16ab714d57916f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8dv5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.765820 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00ec53d-b04b-4ac4-b626-c0582bda7471\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5daaa83677d79853eb2fee9d9c23a3b0cdc605ed7cbdc9035398272dec901f33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df23df31cee6a5831601de7b58d6e70b19456b9e410df2b2061be651927a1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ab7863aa3997f849fe685ae36085186463a760ab10187231605eb4a1bc181b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://399838e4442f6b2e73986a31a47c30cc1fbbe45693a3f304f6b72f6f210565c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://399838e4442f6b2e73986a31a47c30cc1fbbe45693a3f304f6b72f6f210565c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.779893 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd4846
57d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.795565 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.808494 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cbec22bbb541e1899f2414143c3c295a3824da919403f4bf9d7a3d2f7e49a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:29Z\\\",\\\"message\\\":\\\"2025-12-01T10:31:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_00c808ab-b226-48b0-a049-ed55cc130f78\\\\n2025-12-01T10:31:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_00c808ab-b226-48b0-a049-ed55cc130f78 to /host/opt/cni/bin/\\\\n2025-12-01T10:31:44Z [verbose] multus-daemon started\\\\n2025-12-01T10:31:44Z [verbose] Readiness Indicator file check\\\\n2025-12-01T10:32:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.820013 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.831240 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.831820 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.831857 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.831866 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.831897 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.831908 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.848952 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.861511 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.877438 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.890650 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d4
5f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.901994 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z48j9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca0394a-c980-4220-ab44-d2f55519cb1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:57Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z48j9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc 
kubenswrapper[4909]: I1201 10:32:37.922379 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6208649d120546e30dbf18199a419b0d73fbe0245cdcb9382f7292ec6bedce9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:09Z\\\",\\\"message\\\":\\\"sion-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 10:32:09.164050 6569 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/no\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:32:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"/node-ca-qggws in node crc\\\\nI1201 10:32:37.215721 6920 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-qggws after 0 failed attempt(s)\\\\nI1201 10:32:37.215727 6920 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-qggws\\\\nI1201 10:32:37.214382 6920 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-2qpdc after 
0 failed attempt(s)\\\\nI1201 10:32:37.215737 6920 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-2qpdc\\\\nI1201 10:32:37.214382 6920 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1201 10:32:37.215744 6920 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF1201 10:32:37.214392 6920 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occur\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:32:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"h
ost-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.933688 
4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.933736 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.933750 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.933769 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.933783 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:37Z","lastTransitionTime":"2025-12-01T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.935803 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.947228 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8daea63e-f4d3-4786-8917-e1a93eee0df8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37f14edf70404ba216b5dc2e1aaad7f289144fd1d6361148cd7a93232140469c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9317725fa67399cc2be4bca84c2bbf5d3dd611420ebc76c9a995a4f2dac6d010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9317725fa67399cc2be4bca84c2bbf5d3dd611420ebc76c9a995a4f2dac6d010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.967153 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.980149 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:
24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:37 crc kubenswrapper[4909]: I1201 10:32:37.994555 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:37Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.007399 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:38Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.036188 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.036229 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.036242 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.036258 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.036270 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:38Z","lastTransitionTime":"2025-12-01T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.138386 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.138429 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.138438 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.138453 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.138463 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:38Z","lastTransitionTime":"2025-12-01T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.241332 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.241628 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.241711 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.241806 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.241957 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:38Z","lastTransitionTime":"2025-12-01T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.256747 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.256759 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:38 crc kubenswrapper[4909]: E1201 10:32:38.257009 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.256847 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:38 crc kubenswrapper[4909]: E1201 10:32:38.257498 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:38 crc kubenswrapper[4909]: E1201 10:32:38.257185 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.344712 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.345039 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.345162 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.345292 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.345402 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:38Z","lastTransitionTime":"2025-12-01T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.454632 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.454707 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.454726 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.454752 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.454770 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:38Z","lastTransitionTime":"2025-12-01T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.558139 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.558197 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.558214 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.558240 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.558258 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:38Z","lastTransitionTime":"2025-12-01T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.662286 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.662367 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.662385 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.662411 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.662431 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:38Z","lastTransitionTime":"2025-12-01T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.742945 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5rks_57aeccf3-ec18-4a73-bd74-9b188de510ad/ovnkube-controller/3.log" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.748865 4909 scope.go:117] "RemoveContainer" containerID="c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef" Dec 01 10:32:38 crc kubenswrapper[4909]: E1201 10:32:38.749383 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.765829 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8daea63e-f4d3-4786-8917-e1a93eee0df8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37f14edf70404ba216b5dc2e1aaad7f289144fd1d6361148cd7a93232140469c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9317725fa67399cc2be4bca84c2bbf5d3dd611420ebc76c9a995a4f2dac6d010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9317725fa67399cc2be4bca84c2bbf5d3dd611420ebc76c9a995a4f2dac6d010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:38Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.767779 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.767957 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.768083 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.768204 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.768326 4909 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:38Z","lastTransitionTime":"2025-12-01T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.803482 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b1564992
8679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:38Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.824581 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:38Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.849369 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:38Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.871955 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.872328 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.872441 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.872634 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.872745 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:38Z","lastTransitionTime":"2025-12-01T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.872193 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:38Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.896942 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"/node-ca-qggws in node crc\\\\nI1201 10:32:37.215721 6920 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-qggws after 0 failed attempt(s)\\\\nI1201 10:32:37.215727 6920 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-qggws\\\\nI1201 10:32:37.214382 6920 
obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-2qpdc after 0 failed attempt(s)\\\\nI1201 10:32:37.215737 6920 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-2qpdc\\\\nI1201 10:32:37.214382 6920 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1201 10:32:37.215744 6920 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF1201 10:32:37.214392 6920 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occur\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:32:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6
426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:38Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.915035 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb9
07da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:38Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.931679 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00ec53d-b04b-4ac4-b626-c0582bda7471\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5daaa83677d79853eb2fee9d9c23a3b0cdc605ed7cbdc9035398272dec901f33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df23df31cee6a5831601de7b58d6e70b19456b9e410df2b2061be651927a1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ab7863aa3997f849fe685ae36085186463a760ab10187231605eb4a1bc181b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://399838e4442f6b2e73986a31a47c30cc1fbbe45693a3f304f6b72f6f210565c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://399838e4442f6b2e73986a31a47c30cc1fbbe45693a3f304f6b72f6f210565c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:38Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.952773 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd4846
57d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:38Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.968782 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T10:32:38Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.975525 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.975581 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.975594 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.975616 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.975630 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:38Z","lastTransitionTime":"2025-12-01T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.985058 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cbec22bbb541e1899f2414143c3c295a3824da919403f4bf9d7a3d2f7e49a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:29Z\\\",\\\"message\\\":\\\"2025-12-01T10:31:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_00c808ab-b226-48b0-a049-ed55cc130f78\\\\n2025-12-01T10:31:44+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_00c808ab-b226-48b0-a049-ed55cc130f78 to /host/opt/cni/bin/\\\\n2025-12-01T10:31:44Z [verbose] multus-daemon started\\\\n2025-12-01T10:31:44Z [verbose] Readiness Indicator file check\\\\n2025-12-01T10:32:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:38Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:38 crc kubenswrapper[4909]: I1201 10:32:38.998697 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:38Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.016249 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caaa6b24-fb98-4908-b7a7-929c44181c99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc679abce825df1b24d356105ee755209df36f9bbaf961f7a448e30a7561b24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc42c2cf8c051b5eaf0d65edea6f2fccc58ef6a798030d1e16ab714d57916f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8dv5p\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:39Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.028512 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:39Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.041851 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:39Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.052639 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:39Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.066249 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:39Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.078447 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.078502 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.078515 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.078537 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.078551 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:39Z","lastTransitionTime":"2025-12-01T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.082262 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:39Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.094353 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z48j9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca0394a-c980-4220-ab44-d2f55519cb1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:57Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z48j9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:39Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:39 crc 
kubenswrapper[4909]: I1201 10:32:39.181212 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.181275 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.181292 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.181316 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.181335 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:39Z","lastTransitionTime":"2025-12-01T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.257215 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:39 crc kubenswrapper[4909]: E1201 10:32:39.257367 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.283820 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.283867 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.283904 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.283925 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.283937 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:39Z","lastTransitionTime":"2025-12-01T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.386767 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.386816 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.386825 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.386841 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.386853 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:39Z","lastTransitionTime":"2025-12-01T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.489333 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.489370 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.489379 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.489393 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.489402 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:39Z","lastTransitionTime":"2025-12-01T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.591526 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.591592 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.591985 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.592053 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.592077 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:39Z","lastTransitionTime":"2025-12-01T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.695235 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.695283 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.695296 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.695313 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.695323 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:39Z","lastTransitionTime":"2025-12-01T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.797007 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.797056 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.797074 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.797095 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.797110 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:39Z","lastTransitionTime":"2025-12-01T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.899476 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.899544 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.899558 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.899575 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:39 crc kubenswrapper[4909]: I1201 10:32:39.899586 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:39Z","lastTransitionTime":"2025-12-01T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.001847 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.001955 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.001979 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.002010 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.002030 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:40Z","lastTransitionTime":"2025-12-01T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.105116 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.105168 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.105184 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.105207 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.105223 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:40Z","lastTransitionTime":"2025-12-01T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.208041 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.208087 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.208098 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.208115 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.208128 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:40Z","lastTransitionTime":"2025-12-01T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.256581 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.256618 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:40 crc kubenswrapper[4909]: E1201 10:32:40.256714 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:40 crc kubenswrapper[4909]: E1201 10:32:40.256833 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.257155 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:40 crc kubenswrapper[4909]: E1201 10:32:40.257291 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.310957 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.310986 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.310995 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.311008 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.311016 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:40Z","lastTransitionTime":"2025-12-01T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.413417 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.413459 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.413491 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.413509 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.413522 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:40Z","lastTransitionTime":"2025-12-01T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.515881 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.515911 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.515923 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.515949 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.515960 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:40Z","lastTransitionTime":"2025-12-01T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.618053 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.618107 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.618119 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.618138 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.618151 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:40Z","lastTransitionTime":"2025-12-01T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.720630 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.720676 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.720688 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.720706 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.720722 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:40Z","lastTransitionTime":"2025-12-01T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.823854 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.823953 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.823965 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.823983 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.823995 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:40Z","lastTransitionTime":"2025-12-01T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.927086 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.927121 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.927133 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.927150 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:40 crc kubenswrapper[4909]: I1201 10:32:40.927161 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:40Z","lastTransitionTime":"2025-12-01T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.030054 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.030141 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.030154 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.030192 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.030206 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:41Z","lastTransitionTime":"2025-12-01T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.133229 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.133268 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.133277 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.133291 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.133301 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:41Z","lastTransitionTime":"2025-12-01T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.235019 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.235060 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.235069 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.235082 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.235092 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:41Z","lastTransitionTime":"2025-12-01T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.256795 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:41 crc kubenswrapper[4909]: E1201 10:32:41.256919 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.337758 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.337802 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.337813 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.337828 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.337839 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:41Z","lastTransitionTime":"2025-12-01T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.441157 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.441232 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.441260 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.441291 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.441313 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:41Z","lastTransitionTime":"2025-12-01T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.544003 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.544068 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.544094 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.544122 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.544145 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:41Z","lastTransitionTime":"2025-12-01T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.646949 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.647032 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.647053 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.647077 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.647097 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:41Z","lastTransitionTime":"2025-12-01T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.749893 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.749958 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.749980 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.750006 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.750023 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:41Z","lastTransitionTime":"2025-12-01T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.852931 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.852982 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.853000 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.853018 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.853031 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:41Z","lastTransitionTime":"2025-12-01T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.955975 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.956020 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.956033 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.956048 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:41 crc kubenswrapper[4909]: I1201 10:32:41.956059 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:41Z","lastTransitionTime":"2025-12-01T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.059153 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.059221 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.059239 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.059262 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.059277 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:42Z","lastTransitionTime":"2025-12-01T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.161161 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.161215 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.161227 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.161247 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.161260 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:42Z","lastTransitionTime":"2025-12-01T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.256151 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.256211 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:42 crc kubenswrapper[4909]: E1201 10:32:42.256284 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.256211 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:42 crc kubenswrapper[4909]: E1201 10:32:42.256352 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:42 crc kubenswrapper[4909]: E1201 10:32:42.256470 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.264160 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.264201 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.264210 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.264226 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.264236 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:42Z","lastTransitionTime":"2025-12-01T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.366692 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.366748 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.366769 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.366792 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.366808 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:42Z","lastTransitionTime":"2025-12-01T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.469069 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.469114 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.469123 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.469162 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.469173 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:42Z","lastTransitionTime":"2025-12-01T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.571962 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.572004 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.572012 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.572027 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.572035 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:42Z","lastTransitionTime":"2025-12-01T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.675497 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.675570 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.675589 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.675614 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.675630 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:42Z","lastTransitionTime":"2025-12-01T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.777828 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.777935 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.777953 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.777975 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.777990 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:42Z","lastTransitionTime":"2025-12-01T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.880041 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.880076 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.880085 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.880098 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.880107 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:42Z","lastTransitionTime":"2025-12-01T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.982914 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.982966 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.982986 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.983010 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:42 crc kubenswrapper[4909]: I1201 10:32:42.983028 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:42Z","lastTransitionTime":"2025-12-01T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.086540 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.086628 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.086652 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.086685 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.086732 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:43Z","lastTransitionTime":"2025-12-01T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.189585 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.189630 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.189639 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.189654 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.189665 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:43Z","lastTransitionTime":"2025-12-01T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.256948 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:43 crc kubenswrapper[4909]: E1201 10:32:43.257344 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.274419 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00ec53d-b04b-4ac4-b626-c0582bda7471\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5daaa83677d79853eb2fee9d9c23a3b0cdc605ed7cbdc9035398272dec901f33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://7df23df31cee6a5831601de7b58d6e70b19456b9e410df2b2061be651927a1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ab7863aa3997f849fe685ae36085186463a760ab10187231605eb4a1bc181b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://399838e4442f6b2e73986a31a47c30cc1fbbe45693a3f304f6b72f6f210565c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://399838e4442f6b2e73986a31a47c30cc1fbbe45693a3f304f6b72f6f210565c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.289137 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd4846
57d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.293188 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.293254 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.293268 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.293287 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.293329 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:43Z","lastTransitionTime":"2025-12-01T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.303183 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.325741 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cbec22bbb541e1899f2414143c3c295a3824da919403f4bf9d7a3d2f7e49a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad
9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:29Z\\\",\\\"message\\\":\\\"2025-12-01T10:31:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_00c808ab-b226-48b0-a049-ed55cc130f78\\\\n2025-12-01T10:31:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_00c808ab-b226-48b0-a049-ed55cc130f78 to /host/opt/cni/bin/\\\\n2025-12-01T10:31:44Z [verbose] multus-daemon started\\\\n2025-12-01T10:31:44Z [verbose] Readiness Indicator file check\\\\n2025-12-01T10:32:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\"
,\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.338830 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.361704 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caaa6b24-fb98-4908-b7a7-929c44181c99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc679abce825df1b24d356105ee755209df36f9bbaf961f7a448e30a7561b24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc42c2cf8c051b5eaf0d65edea6f2fccc58ef6a798030d1e16ab714d57916f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8dv5p\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.378868 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.393080 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.396843 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.396918 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.396933 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.396954 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.396966 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:43Z","lastTransitionTime":"2025-12-01T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.406662 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc188688
3c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.423517 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.438958 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d4
5f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.451966 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z48j9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca0394a-c980-4220-ab44-d2f55519cb1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:57Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z48j9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:43 crc 
kubenswrapper[4909]: I1201 10:32:43.460671 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8daea63e-f4d3-4786-8917-e1a93eee0df8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37f14edf70404ba216b5dc2e1aaad7f289144fd1d6361148cd7a93232140469c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://9317725fa67399cc2be4bca84c2bbf5d3dd611420ebc76c9a995a4f2dac6d010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9317725fa67399cc2be4bca84c2bbf5d3dd611420ebc76c9a995a4f2dac6d010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.477808 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.493075 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:
24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.499553 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.499597 
4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.499612 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.499634 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.499646 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:43Z","lastTransitionTime":"2025-12-01T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.510510 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.525145 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.545085 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"/node-ca-qggws in node crc\\\\nI1201 10:32:37.215721 6920 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-qggws after 0 failed attempt(s)\\\\nI1201 10:32:37.215727 6920 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-qggws\\\\nI1201 10:32:37.214382 6920 
obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-2qpdc after 0 failed attempt(s)\\\\nI1201 10:32:37.215737 6920 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-2qpdc\\\\nI1201 10:32:37.214382 6920 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1201 10:32:37.215744 6920 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF1201 10:32:37.214392 6920 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occur\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:32:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6
426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.557076 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb9
07da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.602349 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.602383 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.602391 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:43 crc 
kubenswrapper[4909]: I1201 10:32:43.602406 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.602416 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:43Z","lastTransitionTime":"2025-12-01T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.704491 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.704547 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.704563 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.704587 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.704604 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:43Z","lastTransitionTime":"2025-12-01T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.807997 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.808085 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.808109 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.808140 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.808163 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:43Z","lastTransitionTime":"2025-12-01T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.910242 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.910305 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.910319 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.910345 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:43 crc kubenswrapper[4909]: I1201 10:32:43.910358 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:43Z","lastTransitionTime":"2025-12-01T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.013246 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.013292 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.013302 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.013319 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.013329 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:44Z","lastTransitionTime":"2025-12-01T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.116405 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.116455 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.116469 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.116490 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.116507 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:44Z","lastTransitionTime":"2025-12-01T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.219823 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.219893 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.219907 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.219950 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.219965 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:44Z","lastTransitionTime":"2025-12-01T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.256806 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.256827 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:44 crc kubenswrapper[4909]: E1201 10:32:44.257079 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.256855 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:44 crc kubenswrapper[4909]: E1201 10:32:44.257316 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:44 crc kubenswrapper[4909]: E1201 10:32:44.257590 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.323697 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.323765 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.323775 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.323792 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.323808 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:44Z","lastTransitionTime":"2025-12-01T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.426147 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.426195 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.426205 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.426222 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.426233 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:44Z","lastTransitionTime":"2025-12-01T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.529118 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.529199 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.529217 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.529247 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.529268 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:44Z","lastTransitionTime":"2025-12-01T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.632615 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.632682 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.632693 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.632711 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.632723 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:44Z","lastTransitionTime":"2025-12-01T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.734524 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.734601 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.734619 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.734644 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.734662 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:44Z","lastTransitionTime":"2025-12-01T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.837388 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.837438 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.837447 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.837462 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.837472 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:44Z","lastTransitionTime":"2025-12-01T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.940112 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.940148 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.940159 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.940178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:44 crc kubenswrapper[4909]: I1201 10:32:44.940190 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:44Z","lastTransitionTime":"2025-12-01T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.042011 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.042050 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.042059 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.042073 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.042081 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:45Z","lastTransitionTime":"2025-12-01T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.143698 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.143738 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.143747 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.143763 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.143772 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:45Z","lastTransitionTime":"2025-12-01T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.245976 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.246023 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.246035 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.246053 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.246065 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:45Z","lastTransitionTime":"2025-12-01T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.256403 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:45 crc kubenswrapper[4909]: E1201 10:32:45.256562 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.348640 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.348680 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.348692 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.348707 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.348719 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:45Z","lastTransitionTime":"2025-12-01T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.450813 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.450857 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.450868 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.450917 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.450941 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:45Z","lastTransitionTime":"2025-12-01T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.553125 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.553191 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.553202 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.553221 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.553231 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:45Z","lastTransitionTime":"2025-12-01T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.584013 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:32:45 crc kubenswrapper[4909]: E1201 10:32:45.584136 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 10:33:49.584111435 +0000 UTC m=+146.818582333 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.584176 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.584215 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.584251 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.584282 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:45 crc kubenswrapper[4909]: E1201 10:32:45.584305 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:32:45 crc kubenswrapper[4909]: E1201 10:32:45.584352 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:33:49.584340223 +0000 UTC m=+146.818811121 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:32:45 crc kubenswrapper[4909]: E1201 10:32:45.584370 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:32:45 crc kubenswrapper[4909]: E1201 10:32:45.584390 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:32:45 crc kubenswrapper[4909]: E1201 10:32:45.584404 4909 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:32:45 crc kubenswrapper[4909]: E1201 10:32:45.584423 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:32:45 crc kubenswrapper[4909]: E1201 10:32:45.584437 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 10:33:49.584427126 +0000 UTC m=+146.818898034 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:32:45 crc kubenswrapper[4909]: E1201 10:32:45.584372 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:32:45 crc kubenswrapper[4909]: E1201 10:32:45.584457 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:33:49.584447616 +0000 UTC m=+146.818918514 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:32:45 crc kubenswrapper[4909]: E1201 10:32:45.584459 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:32:45 crc kubenswrapper[4909]: E1201 10:32:45.584471 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:32:45 crc kubenswrapper[4909]: E1201 10:32:45.584495 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 10:33:49.584487577 +0000 UTC m=+146.818958475 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.655617 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.655659 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.655675 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.655692 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.655705 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:45Z","lastTransitionTime":"2025-12-01T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.758538 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.758618 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.758634 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.758658 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.758680 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:45Z","lastTransitionTime":"2025-12-01T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.861901 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.861962 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.861979 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.862005 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.862026 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:45Z","lastTransitionTime":"2025-12-01T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.965444 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.965499 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.965509 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.965526 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:45 crc kubenswrapper[4909]: I1201 10:32:45.965536 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:45Z","lastTransitionTime":"2025-12-01T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.070168 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.070217 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.070227 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.070243 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.070254 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:46Z","lastTransitionTime":"2025-12-01T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.173777 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.173888 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.173906 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.173933 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.173949 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:46Z","lastTransitionTime":"2025-12-01T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.256812 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.256831 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.256972 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:46 crc kubenswrapper[4909]: E1201 10:32:46.256990 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:46 crc kubenswrapper[4909]: E1201 10:32:46.257096 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:46 crc kubenswrapper[4909]: E1201 10:32:46.257276 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.276414 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.276470 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.276483 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.276498 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.276510 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:46Z","lastTransitionTime":"2025-12-01T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.380138 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.380211 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.380234 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.380271 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.380294 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:46Z","lastTransitionTime":"2025-12-01T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.483143 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.483217 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.483240 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.483269 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.483291 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:46Z","lastTransitionTime":"2025-12-01T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.585746 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.585790 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.585804 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.585823 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.585834 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:46Z","lastTransitionTime":"2025-12-01T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.689389 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.689477 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.689501 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.689532 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.689558 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:46Z","lastTransitionTime":"2025-12-01T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.792201 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.792249 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.792261 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.792280 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.792292 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:46Z","lastTransitionTime":"2025-12-01T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.895092 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.895161 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.895180 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.895204 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.895222 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:46Z","lastTransitionTime":"2025-12-01T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.998620 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.998744 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.998758 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.998780 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:46 crc kubenswrapper[4909]: I1201 10:32:46.998796 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:46Z","lastTransitionTime":"2025-12-01T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.101472 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.101539 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.101555 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.101581 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.101600 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:47Z","lastTransitionTime":"2025-12-01T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.204024 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.204134 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.204148 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.204168 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.204179 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:47Z","lastTransitionTime":"2025-12-01T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.257254 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:47 crc kubenswrapper[4909]: E1201 10:32:47.257526 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.306178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.306240 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.306253 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.306272 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.306288 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:47Z","lastTransitionTime":"2025-12-01T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.408931 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.408999 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.409015 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.409049 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.409063 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:47Z","lastTransitionTime":"2025-12-01T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.511127 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.511182 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.511194 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.511212 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.511223 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:47Z","lastTransitionTime":"2025-12-01T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.613941 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.613992 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.614003 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.614022 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.614033 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:47Z","lastTransitionTime":"2025-12-01T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.645387 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.645447 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.645468 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.645489 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.645504 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:47Z","lastTransitionTime":"2025-12-01T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:47 crc kubenswrapper[4909]: E1201 10:32:47.668239 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.671802 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.671842 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.671861 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.671908 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.671919 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:47Z","lastTransitionTime":"2025-12-01T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:47 crc kubenswrapper[4909]: E1201 10:32:47.685770 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.689110 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.689144 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.689164 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.689184 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.689205 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:47Z","lastTransitionTime":"2025-12-01T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:47 crc kubenswrapper[4909]: E1201 10:32:47.702102 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.705855 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.705916 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.705925 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.705941 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.705954 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:47Z","lastTransitionTime":"2025-12-01T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:47 crc kubenswrapper[4909]: E1201 10:32:47.719283 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.723262 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.723296 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.723309 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.723330 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.723343 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:47Z","lastTransitionTime":"2025-12-01T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:47 crc kubenswrapper[4909]: E1201 10:32:47.740769 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:47 crc kubenswrapper[4909]: E1201 10:32:47.740893 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.742131 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.742173 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.742189 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.742205 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.742216 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:47Z","lastTransitionTime":"2025-12-01T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.844660 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.844716 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.844728 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.844744 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.844754 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:47Z","lastTransitionTime":"2025-12-01T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.946822 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.946863 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.946894 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.946912 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:47 crc kubenswrapper[4909]: I1201 10:32:47.946924 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:47Z","lastTransitionTime":"2025-12-01T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.048982 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.049041 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.049058 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.049081 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.049096 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:48Z","lastTransitionTime":"2025-12-01T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.151377 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.151426 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.151439 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.151459 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.151471 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:48Z","lastTransitionTime":"2025-12-01T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.254291 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.254335 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.254348 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.254366 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.254377 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:48Z","lastTransitionTime":"2025-12-01T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.256747 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.256759 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.256746 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:48 crc kubenswrapper[4909]: E1201 10:32:48.256867 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:48 crc kubenswrapper[4909]: E1201 10:32:48.257014 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:48 crc kubenswrapper[4909]: E1201 10:32:48.257061 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.356630 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.356679 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.356727 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.356744 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.356756 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:48Z","lastTransitionTime":"2025-12-01T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.458744 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.458782 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.458790 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.458805 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.458815 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:48Z","lastTransitionTime":"2025-12-01T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.561225 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.561459 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.561534 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.561608 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.561669 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:48Z","lastTransitionTime":"2025-12-01T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.663655 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.663695 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.663705 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.663718 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.663727 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:48Z","lastTransitionTime":"2025-12-01T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.766423 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.766461 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.766472 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.766490 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.766503 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:48Z","lastTransitionTime":"2025-12-01T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.868493 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.868555 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.868573 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.868597 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.868615 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:48Z","lastTransitionTime":"2025-12-01T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.970983 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.971028 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.971039 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.971057 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:48 crc kubenswrapper[4909]: I1201 10:32:48.971069 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:48Z","lastTransitionTime":"2025-12-01T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.073834 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.073909 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.073921 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.073936 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.073949 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:49Z","lastTransitionTime":"2025-12-01T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.176865 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.176936 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.176950 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.176968 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.176979 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:49Z","lastTransitionTime":"2025-12-01T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.257288 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:49 crc kubenswrapper[4909]: E1201 10:32:49.257499 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.279429 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.279470 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.279479 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.279493 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.279502 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:49Z","lastTransitionTime":"2025-12-01T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.381647 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.381704 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.381717 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.381731 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.381742 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:49Z","lastTransitionTime":"2025-12-01T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.483402 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.483451 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.483464 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.483479 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.483490 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:49Z","lastTransitionTime":"2025-12-01T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.585839 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.585900 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.585913 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.585929 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.585940 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:49Z","lastTransitionTime":"2025-12-01T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.688655 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.688701 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.688710 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.688725 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.688734 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:49Z","lastTransitionTime":"2025-12-01T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.790721 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.790781 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.790791 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.790806 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.790839 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:49Z","lastTransitionTime":"2025-12-01T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.894079 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.894137 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.894155 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.894179 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.894196 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:49Z","lastTransitionTime":"2025-12-01T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.996602 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.996630 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.996638 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.996651 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:49 crc kubenswrapper[4909]: I1201 10:32:49.996659 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:49Z","lastTransitionTime":"2025-12-01T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.098857 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.098900 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.098907 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.098919 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.098927 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:50Z","lastTransitionTime":"2025-12-01T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.201812 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.201896 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.201908 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.201923 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.201933 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:50Z","lastTransitionTime":"2025-12-01T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.256653 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.256683 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:50 crc kubenswrapper[4909]: E1201 10:32:50.256793 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.256925 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:50 crc kubenswrapper[4909]: E1201 10:32:50.257023 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:50 crc kubenswrapper[4909]: E1201 10:32:50.257090 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.304715 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.304772 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.304783 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.304822 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.304833 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:50Z","lastTransitionTime":"2025-12-01T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.407247 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.407315 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.407333 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.407357 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.407372 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:50Z","lastTransitionTime":"2025-12-01T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.510100 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.510147 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.510159 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.510176 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.510189 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:50Z","lastTransitionTime":"2025-12-01T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.613198 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.613263 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.613274 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.613293 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.613305 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:50Z","lastTransitionTime":"2025-12-01T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.722632 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.722690 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.722708 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.722732 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.722750 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:50Z","lastTransitionTime":"2025-12-01T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.825752 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.826342 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.826518 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.826680 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.826818 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:50Z","lastTransitionTime":"2025-12-01T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.929825 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.930177 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.930408 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.930630 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:50 crc kubenswrapper[4909]: I1201 10:32:50.930817 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:50Z","lastTransitionTime":"2025-12-01T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.033175 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.033218 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.033227 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.033242 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.033252 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:51Z","lastTransitionTime":"2025-12-01T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.136511 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.136550 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.136589 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.136631 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.136641 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:51Z","lastTransitionTime":"2025-12-01T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.238946 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.239742 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.239965 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.240261 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.240426 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:51Z","lastTransitionTime":"2025-12-01T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.256984 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:51 crc kubenswrapper[4909]: E1201 10:32:51.257163 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.344316 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.344366 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.344377 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.344394 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.344405 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:51Z","lastTransitionTime":"2025-12-01T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.446206 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.446258 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.446276 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.446297 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.446312 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:51Z","lastTransitionTime":"2025-12-01T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.549014 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.549472 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.549647 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.549806 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.550005 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:51Z","lastTransitionTime":"2025-12-01T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.653554 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.653608 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.653626 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.653649 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.653666 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:51Z","lastTransitionTime":"2025-12-01T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.757140 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.757207 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.757225 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.757254 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.757275 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:51Z","lastTransitionTime":"2025-12-01T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.860475 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.860520 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.860529 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.860544 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.860555 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:51Z","lastTransitionTime":"2025-12-01T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.963690 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.963771 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.963804 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.963833 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:51 crc kubenswrapper[4909]: I1201 10:32:51.963858 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:51Z","lastTransitionTime":"2025-12-01T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.067066 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.068117 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.068161 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.068187 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.068206 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:52Z","lastTransitionTime":"2025-12-01T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.171040 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.171145 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.171178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.171216 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.171254 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:52Z","lastTransitionTime":"2025-12-01T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.256716 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.256852 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.256965 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:52 crc kubenswrapper[4909]: E1201 10:32:52.256990 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:52 crc kubenswrapper[4909]: E1201 10:32:52.257111 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:52 crc kubenswrapper[4909]: E1201 10:32:52.257669 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.258055 4909 scope.go:117] "RemoveContainer" containerID="c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef" Dec 01 10:32:52 crc kubenswrapper[4909]: E1201 10:32:52.258288 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.274043 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.274089 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.274098 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.274113 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.274123 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:52Z","lastTransitionTime":"2025-12-01T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.376713 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.376855 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.376920 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.376962 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.376980 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:52Z","lastTransitionTime":"2025-12-01T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.480428 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.480491 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.480506 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.480533 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.480550 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:52Z","lastTransitionTime":"2025-12-01T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.583978 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.584067 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.584087 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.584117 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.584137 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:52Z","lastTransitionTime":"2025-12-01T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.687503 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.687818 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.687953 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.688068 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.688159 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:52Z","lastTransitionTime":"2025-12-01T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.790356 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.790403 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.790415 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.790437 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.790454 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:52Z","lastTransitionTime":"2025-12-01T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.894003 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.894406 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.894611 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.894784 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:52 crc kubenswrapper[4909]: I1201 10:32:52.894968 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:52Z","lastTransitionTime":"2025-12-01T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.005821 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.005906 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.005916 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.005936 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.005946 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:53Z","lastTransitionTime":"2025-12-01T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.108735 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.108786 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.108796 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.108812 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.108822 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:53Z","lastTransitionTime":"2025-12-01T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.211422 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.211462 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.211470 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.211486 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.211496 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:53Z","lastTransitionTime":"2025-12-01T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.256599 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:53 crc kubenswrapper[4909]: E1201 10:32:53.256906 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.278574 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.304632 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db1501e3-b64b-4bbf-97ec-85f97fb68afb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebefb90c46fea58ca2492708469b770d49584875120d11d61abf02decceb5628\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b426a55009d5049fc085b07abe80dc2908fd38d298a4aa6f095a9cd3ae6942b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7319b7253d1509792909b52513e7dc00e0846e5d9081e4bb8edcc97d82a42b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03a230a4e6ebf6fc39127277079ee8376348cd166850afab303951bad6ed09a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3d4
5f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c3d45f9b7bf94a695788f086124c388dc13869bd417b322b8df7844509a96bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31055d6c6a330a2ffb124ea9bd721daacfbdb5f671b2b38a6ab16ff160d6c05e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd48998d5c7cc9948a0688dc8c7eed869f92b4231ccc1f41141739242b0d1fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sm45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hr4n5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.315830 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.316479 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.316672 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.317147 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.317324 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:53Z","lastTransitionTime":"2025-12-01T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.324733 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z48j9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dca0394a-c980-4220-ab44-d2f55519cb1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mm7r6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:57Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z48j9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:53 crc 
kubenswrapper[4909]: I1201 10:32:53.340242 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9c800637506567cb75b352adad135c41dbe62562f26b644f5c8736850a7d999\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.354747 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50fdee9b8440d524df44bbb343a6a5111608a99e056af7dd7c335ca01a2df01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b28934f86fcdefd8f584fb6747fae5c242223fefb89d8fe00495e3376d7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.377008 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57aeccf3-ec18-4a73-bd74-9b188de510ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:37Z\\\",\\\"message\\\":\\\"/node-ca-qggws in node crc\\\\nI1201 10:32:37.215721 6920 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-qggws after 0 failed attempt(s)\\\\nI1201 10:32:37.215727 6920 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-qggws\\\\nI1201 10:32:37.214382 6920 
obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-2qpdc after 0 failed attempt(s)\\\\nI1201 10:32:37.215737 6920 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-2qpdc\\\\nI1201 10:32:37.214382 6920 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1201 10:32:37.215744 6920 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF1201 10:32:37.214392 6920 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occur\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:32:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed1934210b25403b6
426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xh2w8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5rks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.391992 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"672850e4-d044-44cc-b8a2-517dc1a285be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f594b0d461b877d9aea304378de3e2b4dddb41b3609b237d4d810c1a4a2945da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://068b242f2e1a1ea5171531eee2b567e105515eb9
07da8f3626dfad1cd2e1954d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4pcf2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.404249 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8daea63e-f4d3-4786-8917-e1a93eee0df8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37f14edf70404ba216b5dc2e1aaad7f289144fd1d6361148cd7a93232140469c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9317725fa67399cc2be4bca84c2bbf5d3dd611420ebc76c9a995a4f2dac6d010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9317725fa67399cc2be4bca84c2bbf5d3dd611420ebc76c9a995a4f2dac6d010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.420750 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.421041 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.421055 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.421080 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.421093 4909 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:53Z","lastTransitionTime":"2025-12-01T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.430610 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae9522b-7d98-4517-bd38-2ceb100b6bfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2d4ee32a0ef470ee47bbedfb318af9b8d676ecd3a9f1977b585d260e8736ea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d4cf9759b113ca056a6a527f89a649ddac8039d1c8a1782c6fed8d36edce24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6f789fd37c3a9fe247bbc30045a2c67e847ba8504544c5803b2f44dbc48ba1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89ce1ad4cf0fa5e67405315aa05fa807a8c1c4956b5c9d331d978ebf4d2ef7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb6ee6ce36b5c8e44dc1f5101a3594eab4161b3b526a8b2d8c6886b5aa9d6094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b1564992
8679f31bf71565d6da5155fc7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec39ba18677ea4d848f1ec0be94c90b15649928679f31bf71565d6da5155fc7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97f0ec0f49d43c6600c8986ffca3dc626e2ab696144935a71e86923c1ddf0fb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590994c4acaa88fbf63fad249a09873c7c5f44d9ee54126d2022145e383d3c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.445934 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01a3c44-f18a-4365-b6b9-9ce4cb861fb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab9c8b94b28c26295dd84b086b303ef305ba4eb535a78feb9b3ebc518981b5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e637a5650d9e4eb9e3c560e6a5f7ee90b0c0b01cd2144e81ed740682ce51bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7aaa49321c3c5998d7b776d8f0fc66d0bcb039c32894d0d45d148f58747df6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.470007 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2qpdc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89f06a94-5047-41d9-90a3-8433149d22c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cbec22bbb541e1899f2414143c3c295a3824da919403f4bf9d7a3d2f7e49a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:32:29Z\\\",\\\"message\\\":\\\"2025-12-01T10:31:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_00c808ab-b226-48b0-a049-ed55cc130f78\\\\n2025-12-01T10:31:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_00c808ab-b226-48b0-a049-ed55cc130f78 to /host/opt/cni/bin/\\\\n2025-12-01T10:31:44Z [verbose] multus-daemon started\\\\n2025-12-01T10:31:44Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T10:32:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6mwrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:43Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2qpdc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.483197 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qggws" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b27824f-0660-47f4-b7d7-dbe4b908854c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9571fc42b9b51641b
6a293f261d95923560d1e3f62c7e5a314328beaf0bfd8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq9t8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qggws\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.495116 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"caaa6b24-fb98-4908-b7a7-929c44181c99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc679abce825df1b24d356105ee755209df36f9bbaf961f7a448e30a7561b24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc42c2cf8c051b5eaf0d65edea6f2fccc58ef
6a798030d1e16ab714d57916f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sd82w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8dv5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.507548 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00ec53d-b04b-4ac4-b626-c0582bda7471\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5daaa83677d79853eb2fee9d9c23a3b0cdc605ed7cbdc9035398272dec901f33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df23df31cee6a5831601de7b58d6e70b19456b9e410df2b2061be651927a1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ab7863aa3997f849fe685ae36085186463a760ab10187231605eb4a1bc181b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://399838e4442f6b2e73986a31a47c30cc1fbbe45693a3f304f6b72f6f210565c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://399838e4442f6b2e73986a31a47c30cc1fbbe45693a3f304f6b72f6f210565c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.520826 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43b14afb-15c2-4260-9e25-008f9466724b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T10:31:41Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 10:31:35.657903 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 10:31:35.658891 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1746381920/tls.crt::/tmp/serving-cert-1746381920/tls.key\\\\\\\"\\\\nI1201 10:31:41.529558 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 10:31:41.533343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 10:31:41.533373 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 10:31:41.533404 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 10:31:41.533411 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 10:31:41.538561 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1201 10:31:41.538561 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 10:31:41.538592 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538598 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 10:31:41.538603 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 10:31:41.538607 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 10:31:41.538609 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 10:31:41.538612 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1201 10:31:41.540238 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305d3b1640fdd69c78c82f519bd4846
57d74b830026c804cb74ede174322c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T10:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T10:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.523934 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.524062 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.524078 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.524098 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.524112 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:53Z","lastTransitionTime":"2025-12-01T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.532741 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b35982a4849d5e58ae6f30dc51213bd1cefce48548d93e34b212f7be06e43d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.544502 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.556001 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.567041 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tq5mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b1085bc-c2a2-4155-a342-30a9db598319\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:31:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edbb8c311088594051cdf3a26532a59cc1886883c7922aa573b7c9e605a3be52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrbgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T10:31:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tq5mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:53Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.626742 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.626808 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.626822 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.626840 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.626853 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:53Z","lastTransitionTime":"2025-12-01T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.729085 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.729136 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.729148 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.729166 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.729175 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:53Z","lastTransitionTime":"2025-12-01T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.831898 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.831955 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.831967 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.831985 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.831998 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:53Z","lastTransitionTime":"2025-12-01T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.935377 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.935445 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.935493 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.935513 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:53 crc kubenswrapper[4909]: I1201 10:32:53.935525 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:53Z","lastTransitionTime":"2025-12-01T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.038934 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.039009 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.039042 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.039062 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.039073 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:54Z","lastTransitionTime":"2025-12-01T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.142219 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.142276 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.142323 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.142344 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.142356 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:54Z","lastTransitionTime":"2025-12-01T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.245361 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.245446 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.245463 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.245483 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.245499 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:54Z","lastTransitionTime":"2025-12-01T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.256725 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.257098 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:54 crc kubenswrapper[4909]: E1201 10:32:54.257513 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:54 crc kubenswrapper[4909]: E1201 10:32:54.257985 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.258005 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:54 crc kubenswrapper[4909]: E1201 10:32:54.258285 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.348375 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.348420 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.348432 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.348449 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.348460 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:54Z","lastTransitionTime":"2025-12-01T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.451610 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.451671 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.451686 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.451707 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.451722 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:54Z","lastTransitionTime":"2025-12-01T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.555558 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.555614 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.555631 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.555654 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.555672 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:54Z","lastTransitionTime":"2025-12-01T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.659102 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.659158 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.659172 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.659194 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.659208 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:54Z","lastTransitionTime":"2025-12-01T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.761228 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.761272 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.761287 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.761304 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.761315 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:54Z","lastTransitionTime":"2025-12-01T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.863977 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.864025 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.864039 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.864058 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.864075 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:54Z","lastTransitionTime":"2025-12-01T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.966772 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.966815 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.966829 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.966845 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:54 crc kubenswrapper[4909]: I1201 10:32:54.966857 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:54Z","lastTransitionTime":"2025-12-01T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.069448 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.069496 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.069506 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.069520 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.069532 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:55Z","lastTransitionTime":"2025-12-01T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.172339 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.172392 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.172403 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.172427 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.172439 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:55Z","lastTransitionTime":"2025-12-01T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.256413 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:55 crc kubenswrapper[4909]: E1201 10:32:55.256576 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.274113 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.274170 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.274187 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.274212 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.274232 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:55Z","lastTransitionTime":"2025-12-01T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.376795 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.376859 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.376890 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.376907 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.376920 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:55Z","lastTransitionTime":"2025-12-01T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.479834 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.480342 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.480459 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.480717 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.480852 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:55Z","lastTransitionTime":"2025-12-01T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.583537 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.583594 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.583605 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.583621 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.583631 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:55Z","lastTransitionTime":"2025-12-01T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.688772 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.688853 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.688871 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.688905 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.688930 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:55Z","lastTransitionTime":"2025-12-01T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.791230 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.791354 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.791369 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.791387 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.791400 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:55Z","lastTransitionTime":"2025-12-01T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.893560 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.893617 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.893630 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.893650 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.893663 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:55Z","lastTransitionTime":"2025-12-01T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.995870 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.995941 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.995959 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.995978 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:55 crc kubenswrapper[4909]: I1201 10:32:55.995991 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:55Z","lastTransitionTime":"2025-12-01T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.099123 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.099164 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.099175 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.099191 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.099204 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:56Z","lastTransitionTime":"2025-12-01T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.201920 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.202193 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.202356 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.202589 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.202744 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:56Z","lastTransitionTime":"2025-12-01T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.256626 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:56 crc kubenswrapper[4909]: E1201 10:32:56.257155 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.256688 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:56 crc kubenswrapper[4909]: E1201 10:32:56.257388 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.256654 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:56 crc kubenswrapper[4909]: E1201 10:32:56.257563 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.306053 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.306105 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.306115 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.306156 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.306170 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:56Z","lastTransitionTime":"2025-12-01T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.409073 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.409114 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.409122 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.409136 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.409144 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:56Z","lastTransitionTime":"2025-12-01T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.511728 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.511763 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.511772 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.511786 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.511796 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:56Z","lastTransitionTime":"2025-12-01T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.613925 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.613960 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.613969 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.613985 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.614004 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:56Z","lastTransitionTime":"2025-12-01T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.716168 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.716208 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.716221 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.716238 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.716251 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:56Z","lastTransitionTime":"2025-12-01T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.818642 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.818690 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.818699 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.818722 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.818733 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:56Z","lastTransitionTime":"2025-12-01T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.921462 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.921508 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.921516 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.921535 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:56 crc kubenswrapper[4909]: I1201 10:32:56.921547 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:56Z","lastTransitionTime":"2025-12-01T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.023818 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.024178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.024262 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.024348 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.024429 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:57Z","lastTransitionTime":"2025-12-01T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.127356 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.127469 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.127484 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.127503 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.127515 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:57Z","lastTransitionTime":"2025-12-01T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.230248 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.230291 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.230302 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.230319 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.230330 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:57Z","lastTransitionTime":"2025-12-01T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.256209 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:57 crc kubenswrapper[4909]: E1201 10:32:57.256375 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.333252 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.333326 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.333339 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.333355 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.333383 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:57Z","lastTransitionTime":"2025-12-01T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.436602 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.436659 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.436668 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.436682 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.436693 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:57Z","lastTransitionTime":"2025-12-01T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.540045 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.540123 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.540141 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.540169 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.540189 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:57Z","lastTransitionTime":"2025-12-01T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.642702 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.642771 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.642787 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.642806 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.642818 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:57Z","lastTransitionTime":"2025-12-01T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.745355 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.745423 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.745441 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.745469 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.745486 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:57Z","lastTransitionTime":"2025-12-01T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.747382 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.747497 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.747570 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.747638 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.747671 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:57Z","lastTransitionTime":"2025-12-01T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:57 crc kubenswrapper[4909]: E1201 10:32:57.769213 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.774357 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.774419 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.774436 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.774457 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.774473 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:57Z","lastTransitionTime":"2025-12-01T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:57 crc kubenswrapper[4909]: E1201 10:32:57.792464 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.797952 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.798006 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.798018 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.798036 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.798047 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:57Z","lastTransitionTime":"2025-12-01T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:57 crc kubenswrapper[4909]: E1201 10:32:57.810744 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.815391 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.815422 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.815433 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.815466 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.815480 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:57Z","lastTransitionTime":"2025-12-01T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:57 crc kubenswrapper[4909]: E1201 10:32:57.828105 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.832314 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.832374 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.832393 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.832421 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.832441 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:57Z","lastTransitionTime":"2025-12-01T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:57 crc kubenswrapper[4909]: E1201 10:32:57.847592 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:32:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"578ee329-32ca-4325-930b-3c9b1b6c332b\\\",\\\"systemUUID\\\":\\\"b132f599-ba64-4f09-b8b2-2af8c2f13405\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:32:57Z is after 2025-08-24T17:21:41Z" Dec 01 10:32:57 crc kubenswrapper[4909]: E1201 10:32:57.847752 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.849741 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.849786 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.849800 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.849824 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.849841 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:57Z","lastTransitionTime":"2025-12-01T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.952561 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.952613 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.952623 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.952639 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:57 crc kubenswrapper[4909]: I1201 10:32:57.952649 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:57Z","lastTransitionTime":"2025-12-01T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.055563 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.055615 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.055627 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.055644 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.055656 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:58Z","lastTransitionTime":"2025-12-01T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.158831 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.158887 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.158899 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.158917 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.158926 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:58Z","lastTransitionTime":"2025-12-01T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.256602 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.256652 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.256713 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:32:58 crc kubenswrapper[4909]: E1201 10:32:58.256763 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:32:58 crc kubenswrapper[4909]: E1201 10:32:58.256895 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:32:58 crc kubenswrapper[4909]: E1201 10:32:58.256985 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.262104 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.262145 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.262162 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.262180 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.262192 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:58Z","lastTransitionTime":"2025-12-01T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.363557 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.363598 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.363609 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.363630 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.363642 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:58Z","lastTransitionTime":"2025-12-01T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.465490 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.465545 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.465559 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.465579 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.465592 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:58Z","lastTransitionTime":"2025-12-01T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.567435 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.567478 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.567487 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.567502 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.567511 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:58Z","lastTransitionTime":"2025-12-01T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.670082 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.670117 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.670125 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.670138 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.670148 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:58Z","lastTransitionTime":"2025-12-01T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.772371 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.772444 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.772462 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.772484 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.772500 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:58Z","lastTransitionTime":"2025-12-01T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.875366 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.875432 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.875451 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.875478 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.875496 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:58Z","lastTransitionTime":"2025-12-01T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.978144 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.978185 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.978197 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.978215 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:58 crc kubenswrapper[4909]: I1201 10:32:58.978228 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:58Z","lastTransitionTime":"2025-12-01T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.080734 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.080787 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.080800 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.080815 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.080826 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:59Z","lastTransitionTime":"2025-12-01T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.183412 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.183478 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.183491 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.183510 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.183523 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:59Z","lastTransitionTime":"2025-12-01T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.257201 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:32:59 crc kubenswrapper[4909]: E1201 10:32:59.257383 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.284938 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.284974 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.284982 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.284995 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.285003 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:59Z","lastTransitionTime":"2025-12-01T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.387590 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.387635 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.387648 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.387692 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.387707 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:59Z","lastTransitionTime":"2025-12-01T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.507778 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.507852 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.507863 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.507893 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.507903 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:59Z","lastTransitionTime":"2025-12-01T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.610168 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.610201 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.610212 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.610231 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.610247 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:59Z","lastTransitionTime":"2025-12-01T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.712785 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.712837 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.712851 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.712894 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.712908 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:59Z","lastTransitionTime":"2025-12-01T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.815098 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.815136 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.815145 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.815159 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.815168 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:59Z","lastTransitionTime":"2025-12-01T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.917439 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.917564 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.917579 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.917596 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:32:59 crc kubenswrapper[4909]: I1201 10:32:59.917606 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:32:59Z","lastTransitionTime":"2025-12-01T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.019820 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.019862 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.019870 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.019986 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.019995 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:00Z","lastTransitionTime":"2025-12-01T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.122990 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.123043 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.123055 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.123074 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.123086 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:00Z","lastTransitionTime":"2025-12-01T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.226017 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.226069 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.226079 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.226097 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.226110 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:00Z","lastTransitionTime":"2025-12-01T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.256848 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.256984 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.257008 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:00 crc kubenswrapper[4909]: E1201 10:33:00.257095 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:00 crc kubenswrapper[4909]: E1201 10:33:00.257288 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:00 crc kubenswrapper[4909]: E1201 10:33:00.257308 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.328825 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.328890 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.328901 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.328915 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.328926 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:00Z","lastTransitionTime":"2025-12-01T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.431689 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.431745 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.431755 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.431771 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.431781 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:00Z","lastTransitionTime":"2025-12-01T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.535178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.535293 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.535326 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.535363 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.535391 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:00Z","lastTransitionTime":"2025-12-01T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.637801 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.637846 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.637856 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.637894 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.637909 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:00Z","lastTransitionTime":"2025-12-01T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.740194 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.740232 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.740241 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.740254 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.740263 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:00Z","lastTransitionTime":"2025-12-01T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.842621 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.842663 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.842672 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.842686 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.842696 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:00Z","lastTransitionTime":"2025-12-01T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.944517 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.944559 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.944573 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.944590 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:00 crc kubenswrapper[4909]: I1201 10:33:00.944600 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:00Z","lastTransitionTime":"2025-12-01T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.046062 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.046092 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.046101 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.046120 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.046140 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:01Z","lastTransitionTime":"2025-12-01T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.148613 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.148656 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.148674 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.148689 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.148699 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:01Z","lastTransitionTime":"2025-12-01T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.251770 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.251808 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.251823 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.251911 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.251924 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:01Z","lastTransitionTime":"2025-12-01T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.256516 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:33:01 crc kubenswrapper[4909]: E1201 10:33:01.256727 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.345749 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs\") pod \"network-metrics-daemon-z48j9\" (UID: \"dca0394a-c980-4220-ab44-d2f55519cb1a\") " pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:33:01 crc kubenswrapper[4909]: E1201 10:33:01.345996 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:33:01 crc kubenswrapper[4909]: E1201 10:33:01.346078 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs podName:dca0394a-c980-4220-ab44-d2f55519cb1a nodeName:}" failed. No retries permitted until 2025-12-01 10:34:05.346053555 +0000 UTC m=+162.580524483 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs") pod "network-metrics-daemon-z48j9" (UID: "dca0394a-c980-4220-ab44-d2f55519cb1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.354494 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.354569 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.354581 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.354599 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.354613 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:01Z","lastTransitionTime":"2025-12-01T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.457019 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.457076 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.457085 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.457098 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.457125 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:01Z","lastTransitionTime":"2025-12-01T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.564651 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.564725 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.564755 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.564780 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.564799 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:01Z","lastTransitionTime":"2025-12-01T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.667661 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.667714 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.667723 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.667739 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.667751 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:01Z","lastTransitionTime":"2025-12-01T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.770196 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.770307 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.770329 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.770360 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.770382 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:01Z","lastTransitionTime":"2025-12-01T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.872566 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.872611 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.872621 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.872636 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.872647 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:01Z","lastTransitionTime":"2025-12-01T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.975693 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.975785 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.975811 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.976357 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:01 crc kubenswrapper[4909]: I1201 10:33:01.976583 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:01Z","lastTransitionTime":"2025-12-01T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.080126 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.080201 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.080223 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.080258 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.080281 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:02Z","lastTransitionTime":"2025-12-01T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.183555 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.183608 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.183620 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.183635 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.183645 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:02Z","lastTransitionTime":"2025-12-01T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.256844 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.256928 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.256855 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:02 crc kubenswrapper[4909]: E1201 10:33:02.257086 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:02 crc kubenswrapper[4909]: E1201 10:33:02.257246 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:02 crc kubenswrapper[4909]: E1201 10:33:02.257352 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.285364 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.285417 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.285430 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.285448 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.285459 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:02Z","lastTransitionTime":"2025-12-01T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.387976 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.388036 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.388053 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.388077 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.388094 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:02Z","lastTransitionTime":"2025-12-01T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.490221 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.490269 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.490283 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.490300 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.490315 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:02Z","lastTransitionTime":"2025-12-01T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.593119 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.593168 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.593181 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.593203 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.593215 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:02Z","lastTransitionTime":"2025-12-01T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.695918 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.695986 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.696002 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.696019 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.696030 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:02Z","lastTransitionTime":"2025-12-01T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.799150 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.799209 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.799222 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.799244 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.799257 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:02Z","lastTransitionTime":"2025-12-01T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.902096 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.902165 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.902176 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.902195 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:02 crc kubenswrapper[4909]: I1201 10:33:02.902205 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:02Z","lastTransitionTime":"2025-12-01T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.004148 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.004394 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.004431 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.004459 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.004477 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:03Z","lastTransitionTime":"2025-12-01T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.108764 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.108822 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.108835 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.108856 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.108867 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:03Z","lastTransitionTime":"2025-12-01T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.213198 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.213268 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.213280 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.213301 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.213345 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:03Z","lastTransitionTime":"2025-12-01T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.257149 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:33:03 crc kubenswrapper[4909]: E1201 10:33:03.257471 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.315927 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.316225 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.316431 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.316553 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.316659 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:03Z","lastTransitionTime":"2025-12-01T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.350266 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hr4n5" podStartSLOduration=81.350247841 podStartE2EDuration="1m21.350247841s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:03.324774789 +0000 UTC m=+100.559245697" watchObservedRunningTime="2025-12-01 10:33:03.350247841 +0000 UTC m=+100.584718739" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.350413 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=79.350409266 podStartE2EDuration="1m19.350409266s" podCreationTimestamp="2025-12-01 10:31:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:03.348821262 +0000 UTC m=+100.583292170" watchObservedRunningTime="2025-12-01 10:33:03.350409266 +0000 UTC m=+100.584880164" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.419633 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.419898 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.420020 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.420106 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.420179 4909 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:03Z","lastTransitionTime":"2025-12-01T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.477893 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podStartSLOduration=81.477854576 podStartE2EDuration="1m21.477854576s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:03.477406401 +0000 UTC m=+100.711877319" watchObservedRunningTime="2025-12-01 10:33:03.477854576 +0000 UTC m=+100.712325474" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.517996 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=81.517975033 podStartE2EDuration="1m21.517975033s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:03.517027821 +0000 UTC m=+100.751498739" watchObservedRunningTime="2025-12-01 10:33:03.517975033 +0000 UTC m=+100.752445931" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.518804 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=31.51879889 podStartE2EDuration="31.51879889s" podCreationTimestamp="2025-12-01 10:32:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-01 10:33:03.491125815 +0000 UTC m=+100.725596733" watchObservedRunningTime="2025-12-01 10:33:03.51879889 +0000 UTC m=+100.753269788" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.522635 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.522753 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.522821 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.522903 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.522984 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:03Z","lastTransitionTime":"2025-12-01T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.545447 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2qpdc" podStartSLOduration=81.545415301 podStartE2EDuration="1m21.545415301s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:03.543844757 +0000 UTC m=+100.778315695" watchObservedRunningTime="2025-12-01 10:33:03.545415301 +0000 UTC m=+100.779886259" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.555471 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qggws" podStartSLOduration=81.55544988 podStartE2EDuration="1m21.55544988s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:03.554145796 +0000 UTC m=+100.788616734" watchObservedRunningTime="2025-12-01 10:33:03.55544988 +0000 UTC m=+100.789920778" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.570675 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8dv5p" podStartSLOduration=80.570646624 podStartE2EDuration="1m20.570646624s" podCreationTimestamp="2025-12-01 10:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:03.568183161 +0000 UTC m=+100.802654089" watchObservedRunningTime="2025-12-01 10:33:03.570646624 +0000 UTC m=+100.805117532" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.589667 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=48.589629536 
podStartE2EDuration="48.589629536s" podCreationTimestamp="2025-12-01 10:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:03.588948593 +0000 UTC m=+100.823419511" watchObservedRunningTime="2025-12-01 10:33:03.589629536 +0000 UTC m=+100.824100434" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.608370 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=82.608339299 podStartE2EDuration="1m22.608339299s" podCreationTimestamp="2025-12-01 10:31:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:03.605977489 +0000 UTC m=+100.840448397" watchObservedRunningTime="2025-12-01 10:33:03.608339299 +0000 UTC m=+100.842810217" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.617922 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tq5mk" podStartSLOduration=82.617899512 podStartE2EDuration="1m22.617899512s" podCreationTimestamp="2025-12-01 10:31:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:03.617378185 +0000 UTC m=+100.851849093" watchObservedRunningTime="2025-12-01 10:33:03.617899512 +0000 UTC m=+100.852370420" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.625771 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.625846 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.625903 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.625924 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.625937 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:03Z","lastTransitionTime":"2025-12-01T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.728845 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.728915 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.728928 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.728947 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.728956 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:03Z","lastTransitionTime":"2025-12-01T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.831974 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.832045 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.832067 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.832096 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.832114 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:03Z","lastTransitionTime":"2025-12-01T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.934807 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.934930 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.934952 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.934976 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:03 crc kubenswrapper[4909]: I1201 10:33:03.934992 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:03Z","lastTransitionTime":"2025-12-01T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.037736 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.038028 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.038140 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.038212 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.038277 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:04Z","lastTransitionTime":"2025-12-01T10:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.140915 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.140957 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.140965 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.140985 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.140995 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:04Z","lastTransitionTime":"2025-12-01T10:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.244982 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.245059 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.245074 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.245101 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.245116 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:04Z","lastTransitionTime":"2025-12-01T10:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.256376 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.256443 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:04 crc kubenswrapper[4909]: E1201 10:33:04.256596 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.256619 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:04 crc kubenswrapper[4909]: E1201 10:33:04.256768 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:04 crc kubenswrapper[4909]: E1201 10:33:04.256927 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.349328 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.349394 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.349410 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.349434 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.349450 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:04Z","lastTransitionTime":"2025-12-01T10:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.452981 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.453071 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.453084 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.453105 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.453120 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:04Z","lastTransitionTime":"2025-12-01T10:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.555496 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.556471 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.556572 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.556667 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.556752 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:04Z","lastTransitionTime":"2025-12-01T10:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.659940 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.660027 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.660051 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.660083 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.660107 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:04Z","lastTransitionTime":"2025-12-01T10:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.763153 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.763403 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.763527 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.763597 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.763660 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:04Z","lastTransitionTime":"2025-12-01T10:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.866696 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.866748 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.866762 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.866813 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.866830 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:04Z","lastTransitionTime":"2025-12-01T10:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.969736 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.969819 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.969838 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.969910 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:04 crc kubenswrapper[4909]: I1201 10:33:04.969932 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:04Z","lastTransitionTime":"2025-12-01T10:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.073123 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.073185 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.073202 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.073230 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.073247 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:05Z","lastTransitionTime":"2025-12-01T10:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.176840 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.177224 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.177380 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.177588 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.177757 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:05Z","lastTransitionTime":"2025-12-01T10:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.256932 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:33:05 crc kubenswrapper[4909]: E1201 10:33:05.257129 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.280131 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.280179 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.280192 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.280209 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.280220 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:05Z","lastTransitionTime":"2025-12-01T10:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.383213 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.383262 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.383274 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.383294 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.383307 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:05Z","lastTransitionTime":"2025-12-01T10:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.485805 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.485862 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.485920 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.485945 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.485961 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:05Z","lastTransitionTime":"2025-12-01T10:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.588514 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.588556 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.588566 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.588580 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.588589 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:05Z","lastTransitionTime":"2025-12-01T10:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.692237 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.692270 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.692281 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.692296 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.692308 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:05Z","lastTransitionTime":"2025-12-01T10:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.795191 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.795234 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.795245 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.795262 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.795274 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:05Z","lastTransitionTime":"2025-12-01T10:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.898104 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.898207 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.898233 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.898264 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:05 crc kubenswrapper[4909]: I1201 10:33:05.898287 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:05Z","lastTransitionTime":"2025-12-01T10:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.001277 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.001380 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.001406 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.001439 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.001465 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:06Z","lastTransitionTime":"2025-12-01T10:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.103966 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.104029 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.104047 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.104075 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.104091 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:06Z","lastTransitionTime":"2025-12-01T10:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.207101 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.207478 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.207591 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.207695 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.207777 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:06Z","lastTransitionTime":"2025-12-01T10:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.256937 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.257010 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.257041 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:06 crc kubenswrapper[4909]: E1201 10:33:06.257164 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:06 crc kubenswrapper[4909]: E1201 10:33:06.257289 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:06 crc kubenswrapper[4909]: E1201 10:33:06.257921 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.310065 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.310138 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.310162 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.310193 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.310216 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:06Z","lastTransitionTime":"2025-12-01T10:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.413482 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.413914 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.414029 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.414130 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.414208 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:06Z","lastTransitionTime":"2025-12-01T10:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.516656 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.516723 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.516747 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.516776 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.516796 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:06Z","lastTransitionTime":"2025-12-01T10:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.618592 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.618634 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.618642 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.618657 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.618694 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:06Z","lastTransitionTime":"2025-12-01T10:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.720759 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.720807 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.720819 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.720836 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.720846 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:06Z","lastTransitionTime":"2025-12-01T10:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.823593 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.823639 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.823648 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.823670 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.823680 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:06Z","lastTransitionTime":"2025-12-01T10:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.925766 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.925803 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.925814 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.925832 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:06 crc kubenswrapper[4909]: I1201 10:33:06.925842 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:06Z","lastTransitionTime":"2025-12-01T10:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.028914 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.028966 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.028979 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.028995 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.029005 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:07Z","lastTransitionTime":"2025-12-01T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.131813 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.131941 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.131954 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.131974 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.131986 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:07Z","lastTransitionTime":"2025-12-01T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.233955 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.234002 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.234011 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.234046 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.234058 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:07Z","lastTransitionTime":"2025-12-01T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.256508 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:33:07 crc kubenswrapper[4909]: E1201 10:33:07.256957 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.258466 4909 scope.go:117] "RemoveContainer" containerID="c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef" Dec 01 10:33:07 crc kubenswrapper[4909]: E1201 10:33:07.258759 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j5rks_openshift-ovn-kubernetes(57aeccf3-ec18-4a73-bd74-9b188de510ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.336104 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.336317 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.336403 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.336467 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.336527 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:07Z","lastTransitionTime":"2025-12-01T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.438615 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.438929 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.439026 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.439122 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.439206 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:07Z","lastTransitionTime":"2025-12-01T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.542089 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.542134 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.542147 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.542166 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.542179 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:07Z","lastTransitionTime":"2025-12-01T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.644172 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.644212 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.644223 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.644236 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.644246 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:07Z","lastTransitionTime":"2025-12-01T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.746381 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.746676 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.746743 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.746810 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.746871 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:07Z","lastTransitionTime":"2025-12-01T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.849593 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.849636 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.849648 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.849667 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.849679 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:07Z","lastTransitionTime":"2025-12-01T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.952263 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.952335 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.952357 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.952381 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:07 crc kubenswrapper[4909]: I1201 10:33:07.952398 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:07Z","lastTransitionTime":"2025-12-01T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.055865 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.055999 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.056025 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.056056 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.056080 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:08Z","lastTransitionTime":"2025-12-01T10:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.160188 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.160263 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.160286 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.160318 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.160339 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:08Z","lastTransitionTime":"2025-12-01T10:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.249024 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.249322 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.249437 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.249542 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.249734 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:08Z","lastTransitionTime":"2025-12-01T10:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.256444 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.256532 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:08 crc kubenswrapper[4909]: E1201 10:33:08.256620 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:08 crc kubenswrapper[4909]: E1201 10:33:08.256702 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.256837 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:08 crc kubenswrapper[4909]: E1201 10:33:08.257044 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.275364 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.275627 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.275746 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.275847 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.275989 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:33:08Z","lastTransitionTime":"2025-12-01T10:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.313406 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-wfg9h"] Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.313867 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wfg9h" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.315928 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.316992 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.317210 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.317297 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.422116 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97bd7a6-ab18-4aba-88c2-37952eb7461e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wfg9h\" (UID: \"a97bd7a6-ab18-4aba-88c2-37952eb7461e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wfg9h" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.422163 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a97bd7a6-ab18-4aba-88c2-37952eb7461e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wfg9h\" (UID: \"a97bd7a6-ab18-4aba-88c2-37952eb7461e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wfg9h" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.422184 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/a97bd7a6-ab18-4aba-88c2-37952eb7461e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wfg9h\" (UID: \"a97bd7a6-ab18-4aba-88c2-37952eb7461e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wfg9h" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.422223 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a97bd7a6-ab18-4aba-88c2-37952eb7461e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wfg9h\" (UID: \"a97bd7a6-ab18-4aba-88c2-37952eb7461e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wfg9h" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.422466 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a97bd7a6-ab18-4aba-88c2-37952eb7461e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wfg9h\" (UID: \"a97bd7a6-ab18-4aba-88c2-37952eb7461e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wfg9h" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.524153 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97bd7a6-ab18-4aba-88c2-37952eb7461e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wfg9h\" (UID: \"a97bd7a6-ab18-4aba-88c2-37952eb7461e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wfg9h" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.524228 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a97bd7a6-ab18-4aba-88c2-37952eb7461e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wfg9h\" (UID: \"a97bd7a6-ab18-4aba-88c2-37952eb7461e\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wfg9h" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.524280 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a97bd7a6-ab18-4aba-88c2-37952eb7461e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wfg9h\" (UID: \"a97bd7a6-ab18-4aba-88c2-37952eb7461e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wfg9h" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.524340 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a97bd7a6-ab18-4aba-88c2-37952eb7461e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wfg9h\" (UID: \"a97bd7a6-ab18-4aba-88c2-37952eb7461e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wfg9h" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.524399 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a97bd7a6-ab18-4aba-88c2-37952eb7461e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wfg9h\" (UID: \"a97bd7a6-ab18-4aba-88c2-37952eb7461e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wfg9h" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.524546 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a97bd7a6-ab18-4aba-88c2-37952eb7461e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wfg9h\" (UID: \"a97bd7a6-ab18-4aba-88c2-37952eb7461e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wfg9h" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.524570 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/a97bd7a6-ab18-4aba-88c2-37952eb7461e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wfg9h\" (UID: \"a97bd7a6-ab18-4aba-88c2-37952eb7461e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wfg9h" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.526226 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a97bd7a6-ab18-4aba-88c2-37952eb7461e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wfg9h\" (UID: \"a97bd7a6-ab18-4aba-88c2-37952eb7461e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wfg9h" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.530639 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a97bd7a6-ab18-4aba-88c2-37952eb7461e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wfg9h\" (UID: \"a97bd7a6-ab18-4aba-88c2-37952eb7461e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wfg9h" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.545345 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97bd7a6-ab18-4aba-88c2-37952eb7461e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wfg9h\" (UID: \"a97bd7a6-ab18-4aba-88c2-37952eb7461e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wfg9h" Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.627575 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wfg9h" Dec 01 10:33:08 crc kubenswrapper[4909]: W1201 10:33:08.648126 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda97bd7a6_ab18_4aba_88c2_37952eb7461e.slice/crio-1afe5db7b2530825b6f6a08029384fc49cdb41e478e39987307fd9c3d370fc56 WatchSource:0}: Error finding container 1afe5db7b2530825b6f6a08029384fc49cdb41e478e39987307fd9c3d370fc56: Status 404 returned error can't find the container with id 1afe5db7b2530825b6f6a08029384fc49cdb41e478e39987307fd9c3d370fc56 Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.837050 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wfg9h" event={"ID":"a97bd7a6-ab18-4aba-88c2-37952eb7461e","Type":"ContainerStarted","Data":"d705bb147187b6e094c60ebe4fef1320bc8b143ffaabdabca3e29d05f508f2b4"} Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.837100 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wfg9h" event={"ID":"a97bd7a6-ab18-4aba-88c2-37952eb7461e","Type":"ContainerStarted","Data":"1afe5db7b2530825b6f6a08029384fc49cdb41e478e39987307fd9c3d370fc56"} Dec 01 10:33:08 crc kubenswrapper[4909]: I1201 10:33:08.855249 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wfg9h" podStartSLOduration=86.855210636 podStartE2EDuration="1m26.855210636s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:08.854305906 +0000 UTC m=+106.088776814" watchObservedRunningTime="2025-12-01 10:33:08.855210636 +0000 UTC m=+106.089681544" Dec 01 10:33:09 crc kubenswrapper[4909]: I1201 10:33:09.258246 4909 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:33:09 crc kubenswrapper[4909]: E1201 10:33:09.258429 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:33:10 crc kubenswrapper[4909]: I1201 10:33:10.256755 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:10 crc kubenswrapper[4909]: I1201 10:33:10.256810 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:10 crc kubenswrapper[4909]: E1201 10:33:10.256926 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:10 crc kubenswrapper[4909]: E1201 10:33:10.257031 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:10 crc kubenswrapper[4909]: I1201 10:33:10.257680 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:10 crc kubenswrapper[4909]: E1201 10:33:10.257871 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:11 crc kubenswrapper[4909]: I1201 10:33:11.256386 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:33:11 crc kubenswrapper[4909]: E1201 10:33:11.256530 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:33:12 crc kubenswrapper[4909]: I1201 10:33:12.257068 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:12 crc kubenswrapper[4909]: I1201 10:33:12.257068 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:12 crc kubenswrapper[4909]: E1201 10:33:12.257527 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:12 crc kubenswrapper[4909]: I1201 10:33:12.257083 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:12 crc kubenswrapper[4909]: E1201 10:33:12.257670 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:12 crc kubenswrapper[4909]: E1201 10:33:12.257282 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:13 crc kubenswrapper[4909]: I1201 10:33:13.256544 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:33:13 crc kubenswrapper[4909]: E1201 10:33:13.260008 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:33:14 crc kubenswrapper[4909]: I1201 10:33:14.256311 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:14 crc kubenswrapper[4909]: I1201 10:33:14.256354 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:14 crc kubenswrapper[4909]: I1201 10:33:14.256442 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:14 crc kubenswrapper[4909]: E1201 10:33:14.256668 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:14 crc kubenswrapper[4909]: E1201 10:33:14.256604 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:14 crc kubenswrapper[4909]: E1201 10:33:14.256735 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:15 crc kubenswrapper[4909]: I1201 10:33:15.256749 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:33:15 crc kubenswrapper[4909]: E1201 10:33:15.257517 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:33:16 crc kubenswrapper[4909]: I1201 10:33:16.256778 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:16 crc kubenswrapper[4909]: I1201 10:33:16.256953 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:16 crc kubenswrapper[4909]: E1201 10:33:16.257070 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:16 crc kubenswrapper[4909]: I1201 10:33:16.256821 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:16 crc kubenswrapper[4909]: E1201 10:33:16.257336 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:16 crc kubenswrapper[4909]: E1201 10:33:16.257455 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:16 crc kubenswrapper[4909]: I1201 10:33:16.866765 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2qpdc_89f06a94-5047-41d9-90a3-8433149d22c4/kube-multus/1.log" Dec 01 10:33:16 crc kubenswrapper[4909]: I1201 10:33:16.868743 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2qpdc_89f06a94-5047-41d9-90a3-8433149d22c4/kube-multus/0.log" Dec 01 10:33:16 crc kubenswrapper[4909]: I1201 10:33:16.868955 4909 generic.go:334] "Generic (PLEG): container finished" podID="89f06a94-5047-41d9-90a3-8433149d22c4" containerID="73cbec22bbb541e1899f2414143c3c295a3824da919403f4bf9d7a3d2f7e49a5" exitCode=1 Dec 01 10:33:16 crc kubenswrapper[4909]: I1201 10:33:16.869031 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2qpdc" event={"ID":"89f06a94-5047-41d9-90a3-8433149d22c4","Type":"ContainerDied","Data":"73cbec22bbb541e1899f2414143c3c295a3824da919403f4bf9d7a3d2f7e49a5"} Dec 01 10:33:16 crc kubenswrapper[4909]: I1201 10:33:16.869094 4909 scope.go:117] "RemoveContainer" containerID="74017f3d7ce0ad9f48ef1ad725e01a64f2309b4b7a1408dab0ea3ae64e107784" Dec 01 10:33:16 crc kubenswrapper[4909]: I1201 10:33:16.869963 4909 scope.go:117] "RemoveContainer" containerID="73cbec22bbb541e1899f2414143c3c295a3824da919403f4bf9d7a3d2f7e49a5" Dec 01 10:33:16 crc kubenswrapper[4909]: E1201 10:33:16.870304 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-2qpdc_openshift-multus(89f06a94-5047-41d9-90a3-8433149d22c4)\"" pod="openshift-multus/multus-2qpdc" podUID="89f06a94-5047-41d9-90a3-8433149d22c4" Dec 01 10:33:17 crc kubenswrapper[4909]: I1201 10:33:17.256511 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:33:17 crc kubenswrapper[4909]: E1201 10:33:17.257287 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:33:17 crc kubenswrapper[4909]: I1201 10:33:17.876518 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2qpdc_89f06a94-5047-41d9-90a3-8433149d22c4/kube-multus/1.log" Dec 01 10:33:18 crc kubenswrapper[4909]: I1201 10:33:18.256576 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:18 crc kubenswrapper[4909]: I1201 10:33:18.256676 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:18 crc kubenswrapper[4909]: I1201 10:33:18.256605 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:18 crc kubenswrapper[4909]: E1201 10:33:18.256869 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:18 crc kubenswrapper[4909]: E1201 10:33:18.257248 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:18 crc kubenswrapper[4909]: E1201 10:33:18.257419 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:19 crc kubenswrapper[4909]: I1201 10:33:19.256570 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:33:19 crc kubenswrapper[4909]: E1201 10:33:19.256742 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:33:20 crc kubenswrapper[4909]: I1201 10:33:20.256825 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:20 crc kubenswrapper[4909]: I1201 10:33:20.256825 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:20 crc kubenswrapper[4909]: E1201 10:33:20.257210 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:20 crc kubenswrapper[4909]: E1201 10:33:20.257061 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:20 crc kubenswrapper[4909]: I1201 10:33:20.256835 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:20 crc kubenswrapper[4909]: E1201 10:33:20.257383 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:21 crc kubenswrapper[4909]: I1201 10:33:21.257061 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:33:21 crc kubenswrapper[4909]: E1201 10:33:21.257220 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:33:22 crc kubenswrapper[4909]: I1201 10:33:22.256329 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:22 crc kubenswrapper[4909]: E1201 10:33:22.256484 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:22 crc kubenswrapper[4909]: I1201 10:33:22.256697 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:22 crc kubenswrapper[4909]: E1201 10:33:22.256764 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:22 crc kubenswrapper[4909]: I1201 10:33:22.256955 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:22 crc kubenswrapper[4909]: E1201 10:33:22.257087 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:22 crc kubenswrapper[4909]: I1201 10:33:22.257910 4909 scope.go:117] "RemoveContainer" containerID="c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef" Dec 01 10:33:22 crc kubenswrapper[4909]: I1201 10:33:22.897619 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5rks_57aeccf3-ec18-4a73-bd74-9b188de510ad/ovnkube-controller/3.log" Dec 01 10:33:22 crc kubenswrapper[4909]: I1201 10:33:22.899899 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerStarted","Data":"c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f"} Dec 01 10:33:22 crc kubenswrapper[4909]: I1201 10:33:22.900355 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:33:22 crc kubenswrapper[4909]: I1201 10:33:22.926671 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" podStartSLOduration=100.926653058 podStartE2EDuration="1m40.926653058s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:22.925699386 +0000 UTC m=+120.160170304" watchObservedRunningTime="2025-12-01 10:33:22.926653058 +0000 UTC m=+120.161123956" Dec 01 10:33:23 crc kubenswrapper[4909]: I1201 10:33:23.055867 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z48j9"] Dec 01 10:33:23 crc kubenswrapper[4909]: I1201 10:33:23.056052 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:33:23 crc kubenswrapper[4909]: E1201 10:33:23.056156 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:33:23 crc kubenswrapper[4909]: E1201 10:33:23.259353 4909 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 01 10:33:23 crc kubenswrapper[4909]: E1201 10:33:23.354398 4909 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 10:33:24 crc kubenswrapper[4909]: I1201 10:33:24.256915 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:24 crc kubenswrapper[4909]: E1201 10:33:24.257069 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:24 crc kubenswrapper[4909]: I1201 10:33:24.257188 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:24 crc kubenswrapper[4909]: I1201 10:33:24.256809 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:33:24 crc kubenswrapper[4909]: I1201 10:33:24.257212 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:24 crc kubenswrapper[4909]: E1201 10:33:24.257434 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:24 crc kubenswrapper[4909]: E1201 10:33:24.257672 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:24 crc kubenswrapper[4909]: E1201 10:33:24.257922 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:33:26 crc kubenswrapper[4909]: I1201 10:33:26.256990 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:26 crc kubenswrapper[4909]: I1201 10:33:26.257045 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:33:26 crc kubenswrapper[4909]: E1201 10:33:26.257116 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:26 crc kubenswrapper[4909]: E1201 10:33:26.257247 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:33:26 crc kubenswrapper[4909]: I1201 10:33:26.257388 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:26 crc kubenswrapper[4909]: E1201 10:33:26.257647 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:26 crc kubenswrapper[4909]: I1201 10:33:26.258192 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:26 crc kubenswrapper[4909]: E1201 10:33:26.258429 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:28 crc kubenswrapper[4909]: I1201 10:33:28.256802 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:28 crc kubenswrapper[4909]: I1201 10:33:28.256912 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:28 crc kubenswrapper[4909]: I1201 10:33:28.256815 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:33:28 crc kubenswrapper[4909]: E1201 10:33:28.257113 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:28 crc kubenswrapper[4909]: E1201 10:33:28.257240 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:33:28 crc kubenswrapper[4909]: E1201 10:33:28.257395 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:28 crc kubenswrapper[4909]: I1201 10:33:28.257963 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:28 crc kubenswrapper[4909]: E1201 10:33:28.258222 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:28 crc kubenswrapper[4909]: E1201 10:33:28.356758 4909 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 10:33:29 crc kubenswrapper[4909]: I1201 10:33:29.256741 4909 scope.go:117] "RemoveContainer" containerID="73cbec22bbb541e1899f2414143c3c295a3824da919403f4bf9d7a3d2f7e49a5" Dec 01 10:33:29 crc kubenswrapper[4909]: I1201 10:33:29.927938 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2qpdc_89f06a94-5047-41d9-90a3-8433149d22c4/kube-multus/1.log" Dec 01 10:33:29 crc kubenswrapper[4909]: I1201 10:33:29.928210 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2qpdc" event={"ID":"89f06a94-5047-41d9-90a3-8433149d22c4","Type":"ContainerStarted","Data":"f87f912fdd49fda2a27ad7e25c8a792af8b5c9e78f06e76d346d060137e87026"} Dec 01 10:33:30 crc kubenswrapper[4909]: I1201 10:33:30.256900 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:33:30 crc kubenswrapper[4909]: I1201 10:33:30.256952 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:30 crc kubenswrapper[4909]: I1201 10:33:30.256960 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:30 crc kubenswrapper[4909]: I1201 10:33:30.257028 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:30 crc kubenswrapper[4909]: E1201 10:33:30.257131 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:33:30 crc kubenswrapper[4909]: E1201 10:33:30.257183 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:30 crc kubenswrapper[4909]: E1201 10:33:30.257270 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:30 crc kubenswrapper[4909]: E1201 10:33:30.257420 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:32 crc kubenswrapper[4909]: I1201 10:33:32.256642 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:33:32 crc kubenswrapper[4909]: I1201 10:33:32.256716 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:32 crc kubenswrapper[4909]: I1201 10:33:32.256684 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:32 crc kubenswrapper[4909]: I1201 10:33:32.256642 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:32 crc kubenswrapper[4909]: E1201 10:33:32.256970 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z48j9" podUID="dca0394a-c980-4220-ab44-d2f55519cb1a" Dec 01 10:33:32 crc kubenswrapper[4909]: E1201 10:33:32.257277 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:33:32 crc kubenswrapper[4909]: E1201 10:33:32.257434 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:33:32 crc kubenswrapper[4909]: E1201 10:33:32.257579 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:33:34 crc kubenswrapper[4909]: I1201 10:33:34.256189 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9" Dec 01 10:33:34 crc kubenswrapper[4909]: I1201 10:33:34.256335 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:34 crc kubenswrapper[4909]: I1201 10:33:34.256378 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:34 crc kubenswrapper[4909]: I1201 10:33:34.256646 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:34 crc kubenswrapper[4909]: I1201 10:33:34.260304 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 01 10:33:34 crc kubenswrapper[4909]: I1201 10:33:34.260317 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 01 10:33:34 crc kubenswrapper[4909]: I1201 10:33:34.260510 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 01 10:33:34 crc kubenswrapper[4909]: I1201 10:33:34.260999 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 01 10:33:34 crc kubenswrapper[4909]: I1201 10:33:34.261128 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 01 10:33:34 crc kubenswrapper[4909]: I1201 10:33:34.261355 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.499667 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.552755 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-hw7wm"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.553227 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hw7wm" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.559156 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8r4s"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.560546 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8r4s" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.563258 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d9vlp"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.563833 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d9vlp" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.564895 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.568001 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-2n2sj"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.568839 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2m6qp"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.569426 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-644kz"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.569628 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-2n2sj" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.570202 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2m6qp" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.570199 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-644kz" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.574064 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5f9554d-08bc-4e4a-8178-50cdc07e89c6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-d9vlp\" (UID: \"f5f9554d-08bc-4e4a-8178-50cdc07e89c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d9vlp" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.574163 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc4330e6-2bf8-4d0c-bae1-c0e1544a5642-config\") pod \"machine-approver-56656f9798-hw7wm\" (UID: \"fc4330e6-2bf8-4d0c-bae1-c0e1544a5642\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hw7wm" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.574216 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5f9554d-08bc-4e4a-8178-50cdc07e89c6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-d9vlp\" (UID: \"f5f9554d-08bc-4e4a-8178-50cdc07e89c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d9vlp" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.574263 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcn97\" (UniqueName: \"kubernetes.io/projected/987cc233-91d0-4ed1-8d93-62e90e5e0925-kube-api-access-xcn97\") pod \"downloads-7954f5f757-2n2sj\" (UID: \"987cc233-91d0-4ed1-8d93-62e90e5e0925\") " pod="openshift-console/downloads-7954f5f757-2n2sj" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.574314 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fc4330e6-2bf8-4d0c-bae1-c0e1544a5642-machine-approver-tls\") pod \"machine-approver-56656f9798-hw7wm\" (UID: \"fc4330e6-2bf8-4d0c-bae1-c0e1544a5642\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hw7wm" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.574391 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xlrh\" (UniqueName: \"kubernetes.io/projected/fc4330e6-2bf8-4d0c-bae1-c0e1544a5642-kube-api-access-8xlrh\") pod \"machine-approver-56656f9798-hw7wm\" (UID: \"fc4330e6-2bf8-4d0c-bae1-c0e1544a5642\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hw7wm" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.574434 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wftds\" (UniqueName: \"kubernetes.io/projected/f5f9554d-08bc-4e4a-8178-50cdc07e89c6-kube-api-access-wftds\") pod \"cluster-image-registry-operator-dc59b4c8b-d9vlp\" (UID: \"f5f9554d-08bc-4e4a-8178-50cdc07e89c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d9vlp" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.574486 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46c5cfaf-052d-46ac-a977-41f70c6d0368-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-644kz\" (UID: \"46c5cfaf-052d-46ac-a977-41f70c6d0368\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-644kz" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.574563 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78c5s\" (UniqueName: \"kubernetes.io/projected/46c5cfaf-052d-46ac-a977-41f70c6d0368-kube-api-access-78c5s\") pod \"openshift-controller-manager-operator-756b6f6bc6-644kz\" (UID: \"46c5cfaf-052d-46ac-a977-41f70c6d0368\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-644kz" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.574614 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shtnq\" (UniqueName: \"kubernetes.io/projected/b57f31c3-dad4-47c9-88e7-369524b49e42-kube-api-access-shtnq\") pod \"cluster-samples-operator-665b6dd947-w8r4s\" (UID: \"b57f31c3-dad4-47c9-88e7-369524b49e42\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8r4s" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.574660 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3dd4d9e-bd28-4db6-82ba-8c576f8e2de0-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2m6qp\" (UID: \"e3dd4d9e-bd28-4db6-82ba-8c576f8e2de0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2m6qp" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.574700 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46c5cfaf-052d-46ac-a977-41f70c6d0368-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-644kz\" (UID: 
\"46c5cfaf-052d-46ac-a977-41f70c6d0368\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-644kz" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.574753 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fc4330e6-2bf8-4d0c-bae1-c0e1544a5642-auth-proxy-config\") pod \"machine-approver-56656f9798-hw7wm\" (UID: \"fc4330e6-2bf8-4d0c-bae1-c0e1544a5642\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hw7wm" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.574822 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b57f31c3-dad4-47c9-88e7-369524b49e42-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-w8r4s\" (UID: \"b57f31c3-dad4-47c9-88e7-369524b49e42\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8r4s" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.574869 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f5f9554d-08bc-4e4a-8178-50cdc07e89c6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-d9vlp\" (UID: \"f5f9554d-08bc-4e4a-8178-50cdc07e89c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d9vlp" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.574996 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3dd4d9e-bd28-4db6-82ba-8c576f8e2de0-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2m6qp\" (UID: \"e3dd4d9e-bd28-4db6-82ba-8c576f8e2de0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2m6qp" Dec 01 
10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.575061 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twvl5\" (UniqueName: \"kubernetes.io/projected/e3dd4d9e-bd28-4db6-82ba-8c576f8e2de0-kube-api-access-twvl5\") pod \"openshift-apiserver-operator-796bbdcf4f-2m6qp\" (UID: \"e3dd4d9e-bd28-4db6-82ba-8c576f8e2de0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2m6qp" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.578932 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-m2qzl"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.580408 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.582818 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-msh6q"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.584038 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.585227 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.585945 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-msh6q" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.629807 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.632561 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.635205 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.665167 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.665404 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.666957 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtkzb"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.666985 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.667334 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.667376 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.668262 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.668804 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.669215 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.669413 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.669525 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.669629 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.669409 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.669740 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.669826 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.669973 4909 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"encryption-config-1" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.670111 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.670282 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.670445 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.670470 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.670692 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.670926 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.671003 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.671130 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.671151 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.671500 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.671573 4909 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.671705 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.672066 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.671862 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.672538 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.672572 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.672578 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.672649 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.672345 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.672277 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.672327 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.672372 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.672424 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.672445 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.672461 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.673318 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.673598 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.673814 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.684866 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 01 10:33:38 crc kubenswrapper[4909]: 
I1201 10:33:38.685405 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.685850 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.686754 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.687001 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfp5d\" (UniqueName: \"kubernetes.io/projected/f8782986-5304-487f-962e-5b2e9233ab75-kube-api-access-bfp5d\") pod \"controller-manager-879f6c89f-dtkzb\" (UID: \"f8782986-5304-487f-962e-5b2e9233ab75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.687085 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fc4330e6-2bf8-4d0c-bae1-c0e1544a5642-auth-proxy-config\") pod \"machine-approver-56656f9798-hw7wm\" (UID: \"fc4330e6-2bf8-4d0c-bae1-c0e1544a5642\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hw7wm" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.687150 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-audit-policies\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.687175 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r276z\" (UniqueName: 
\"kubernetes.io/projected/0491dd66-90bc-45d1-a72d-79dbe3f5711e-kube-api-access-r276z\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.687460 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-etcd-client\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.701922 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fc4330e6-2bf8-4d0c-bae1-c0e1544a5642-auth-proxy-config\") pod \"machine-approver-56656f9798-hw7wm\" (UID: \"fc4330e6-2bf8-4d0c-bae1-c0e1544a5642\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hw7wm" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.704460 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b57f31c3-dad4-47c9-88e7-369524b49e42-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-w8r4s\" (UID: \"b57f31c3-dad4-47c9-88e7-369524b49e42\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8r4s" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.704600 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f5f9554d-08bc-4e4a-8178-50cdc07e89c6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-d9vlp\" (UID: \"f5f9554d-08bc-4e4a-8178-50cdc07e89c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d9vlp" Dec 01 10:33:38 crc 
kubenswrapper[4909]: I1201 10:33:38.704755 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3dd4d9e-bd28-4db6-82ba-8c576f8e2de0-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2m6qp\" (UID: \"e3dd4d9e-bd28-4db6-82ba-8c576f8e2de0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2m6qp" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.704797 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twvl5\" (UniqueName: \"kubernetes.io/projected/e3dd4d9e-bd28-4db6-82ba-8c576f8e2de0-kube-api-access-twvl5\") pod \"openshift-apiserver-operator-796bbdcf4f-2m6qp\" (UID: \"e3dd4d9e-bd28-4db6-82ba-8c576f8e2de0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2m6qp" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.704846 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5f9554d-08bc-4e4a-8178-50cdc07e89c6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-d9vlp\" (UID: \"f5f9554d-08bc-4e4a-8178-50cdc07e89c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d9vlp" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.711244 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3dd4d9e-bd28-4db6-82ba-8c576f8e2de0-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2m6qp\" (UID: \"e3dd4d9e-bd28-4db6-82ba-8c576f8e2de0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2m6qp" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.705225 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-audit-dir\") pod 
\"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714436 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce512d0d-7510-49cf-af98-c09f19031ab1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-msh6q\" (UID: \"ce512d0d-7510-49cf-af98-c09f19031ab1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-msh6q" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714477 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0491dd66-90bc-45d1-a72d-79dbe3f5711e-image-import-ca\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714528 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc4330e6-2bf8-4d0c-bae1-c0e1544a5642-config\") pod \"machine-approver-56656f9798-hw7wm\" (UID: \"fc4330e6-2bf8-4d0c-bae1-c0e1544a5642\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hw7wm" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714553 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0491dd66-90bc-45d1-a72d-79dbe3f5711e-audit\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714570 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/0491dd66-90bc-45d1-a72d-79dbe3f5711e-etcd-client\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714596 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5f9554d-08bc-4e4a-8178-50cdc07e89c6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-d9vlp\" (UID: \"f5f9554d-08bc-4e4a-8178-50cdc07e89c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d9vlp"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714618 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-serving-cert\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714638 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtt6p\" (UniqueName: \"kubernetes.io/projected/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-kube-api-access-rtt6p\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714664 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ce512d0d-7510-49cf-af98-c09f19031ab1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-msh6q\" (UID: \"ce512d0d-7510-49cf-af98-c09f19031ab1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-msh6q"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714688 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcn97\" (UniqueName: \"kubernetes.io/projected/987cc233-91d0-4ed1-8d93-62e90e5e0925-kube-api-access-xcn97\") pod \"downloads-7954f5f757-2n2sj\" (UID: \"987cc233-91d0-4ed1-8d93-62e90e5e0925\") " pod="openshift-console/downloads-7954f5f757-2n2sj"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714711 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8782986-5304-487f-962e-5b2e9233ab75-config\") pod \"controller-manager-879f6c89f-dtkzb\" (UID: \"f8782986-5304-487f-962e-5b2e9233ab75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714732 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4crbr\" (UniqueName: \"kubernetes.io/projected/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-kube-api-access-4crbr\") pod \"route-controller-manager-6576b87f9c-j6zbf\" (UID: \"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714756 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714772 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce512d0d-7510-49cf-af98-c09f19031ab1-config\") pod \"kube-apiserver-operator-766d6c64bb-msh6q\" (UID: \"ce512d0d-7510-49cf-af98-c09f19031ab1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-msh6q"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714788 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714806 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0491dd66-90bc-45d1-a72d-79dbe3f5711e-node-pullsecrets\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714824 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-client-ca\") pod \"route-controller-manager-6576b87f9c-j6zbf\" (UID: \"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714847 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0491dd66-90bc-45d1-a72d-79dbe3f5711e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714896 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fc4330e6-2bf8-4d0c-bae1-c0e1544a5642-machine-approver-tls\") pod \"machine-approver-56656f9798-hw7wm\" (UID: \"fc4330e6-2bf8-4d0c-bae1-c0e1544a5642\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hw7wm"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714920 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-encryption-config\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714938 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0491dd66-90bc-45d1-a72d-79dbe3f5711e-serving-cert\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714953 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0491dd66-90bc-45d1-a72d-79dbe3f5711e-encryption-config\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714979 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0491dd66-90bc-45d1-a72d-79dbe3f5711e-config\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.715002 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8782986-5304-487f-962e-5b2e9233ab75-serving-cert\") pod \"controller-manager-879f6c89f-dtkzb\" (UID: \"f8782986-5304-487f-962e-5b2e9233ab75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.714992 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2pjzb"]
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.715505 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc4330e6-2bf8-4d0c-bae1-c0e1544a5642-config\") pod \"machine-approver-56656f9798-hw7wm\" (UID: \"fc4330e6-2bf8-4d0c-bae1-c0e1544a5642\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hw7wm"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.715034 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8782986-5304-487f-962e-5b2e9233ab75-client-ca\") pod \"controller-manager-879f6c89f-dtkzb\" (UID: \"f8782986-5304-487f-962e-5b2e9233ab75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.715919 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2pjzb"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.720111 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8782986-5304-487f-962e-5b2e9233ab75-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dtkzb\" (UID: \"f8782986-5304-487f-962e-5b2e9233ab75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.720227 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xlrh\" (UniqueName: \"kubernetes.io/projected/fc4330e6-2bf8-4d0c-bae1-c0e1544a5642-kube-api-access-8xlrh\") pod \"machine-approver-56656f9798-hw7wm\" (UID: \"fc4330e6-2bf8-4d0c-bae1-c0e1544a5642\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hw7wm"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.720284 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wftds\" (UniqueName: \"kubernetes.io/projected/f5f9554d-08bc-4e4a-8178-50cdc07e89c6-kube-api-access-wftds\") pod \"cluster-image-registry-operator-dc59b4c8b-d9vlp\" (UID: \"f5f9554d-08bc-4e4a-8178-50cdc07e89c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d9vlp"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.720312 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46c5cfaf-052d-46ac-a977-41f70c6d0368-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-644kz\" (UID: \"46c5cfaf-052d-46ac-a977-41f70c6d0368\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-644kz"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.720369 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-config\") pod \"route-controller-manager-6576b87f9c-j6zbf\" (UID: \"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.720388 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-serving-cert\") pod \"route-controller-manager-6576b87f9c-j6zbf\" (UID: \"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.720454 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0491dd66-90bc-45d1-a72d-79dbe3f5711e-audit-dir\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.720480 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78c5s\" (UniqueName: \"kubernetes.io/projected/46c5cfaf-052d-46ac-a977-41f70c6d0368-kube-api-access-78c5s\") pod \"openshift-controller-manager-operator-756b6f6bc6-644kz\" (UID: \"46c5cfaf-052d-46ac-a977-41f70c6d0368\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-644kz"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.720538 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shtnq\" (UniqueName: \"kubernetes.io/projected/b57f31c3-dad4-47c9-88e7-369524b49e42-kube-api-access-shtnq\") pod \"cluster-samples-operator-665b6dd947-w8r4s\" (UID: \"b57f31c3-dad4-47c9-88e7-369524b49e42\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8r4s"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.720565 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3dd4d9e-bd28-4db6-82ba-8c576f8e2de0-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2m6qp\" (UID: \"e3dd4d9e-bd28-4db6-82ba-8c576f8e2de0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2m6qp"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.720620 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0491dd66-90bc-45d1-a72d-79dbe3f5711e-etcd-serving-ca\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.720648 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46c5cfaf-052d-46ac-a977-41f70c6d0368-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-644kz\" (UID: \"46c5cfaf-052d-46ac-a977-41f70c6d0368\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-644kz"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.721379 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fc4330e6-2bf8-4d0c-bae1-c0e1544a5642-machine-approver-tls\") pod \"machine-approver-56656f9798-hw7wm\" (UID: \"fc4330e6-2bf8-4d0c-bae1-c0e1544a5642\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hw7wm"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.721794 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5wkg2"]
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.722503 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5wkg2"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.724124 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46c5cfaf-052d-46ac-a977-41f70c6d0368-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-644kz\" (UID: \"46c5cfaf-052d-46ac-a977-41f70c6d0368\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-644kz"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.732613 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b57f31c3-dad4-47c9-88e7-369524b49e42-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-w8r4s\" (UID: \"b57f31c3-dad4-47c9-88e7-369524b49e42\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8r4s"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.732652 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f5f9554d-08bc-4e4a-8178-50cdc07e89c6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-d9vlp\" (UID: \"f5f9554d-08bc-4e4a-8178-50cdc07e89c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d9vlp"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.741788 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3dd4d9e-bd28-4db6-82ba-8c576f8e2de0-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2m6qp\" (UID: \"e3dd4d9e-bd28-4db6-82ba-8c576f8e2de0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2m6qp"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.743991 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46c5cfaf-052d-46ac-a977-41f70c6d0368-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-644kz\" (UID: \"46c5cfaf-052d-46ac-a977-41f70c6d0368\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-644kz"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.744748 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.745549 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.751275 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fklv5"]
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.751853 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-wfgm2"]
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.752242 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wfgm2"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.752583 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hpz48"]
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.752757 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fklv5"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.753329 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hpz48"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.757885 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h7pgb"]
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.758519 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-vh74j"]
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.758776 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g7qpw"]
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.760403 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.760731 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.761357 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-h7pgb"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.763707 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.766027 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.766233 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.766623 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.766720 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.766893 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.767074 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.767176 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.767311 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.767964 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.769026 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.769316 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-2q24w"]
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.769834 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-2q24w"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.769979 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xmccs"]
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.770250 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g7qpw"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.770572 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xmccs"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.770974 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.773041 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7mhb9"]
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.773912 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-ck6xv"]
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.774538 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ck6xv"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.774947 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.775523 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shtnq\" (UniqueName: \"kubernetes.io/projected/b57f31c3-dad4-47c9-88e7-369524b49e42-kube-api-access-shtnq\") pod \"cluster-samples-operator-665b6dd947-w8r4s\" (UID: \"b57f31c3-dad4-47c9-88e7-369524b49e42\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8r4s"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.783143 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4gg8d"]
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.783890 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nm8z2"]
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.784236 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-j597g"]
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.784801 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j597g"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.785423 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gg8d"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.785604 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nm8z2"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.791005 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b56dj"]
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.791443 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-455s8"]
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.791919 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-67lbv"]
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.792391 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-67lbv"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.794484 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b56dj"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.794830 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-455s8"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.796593 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.796797 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.796966 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.796989 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.797115 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.797150 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.797157 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.797697 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.797753 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.797913 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.797712 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.798539 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.799343 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8r4s"]
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.799822 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.800146 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.800293 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.800501 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.801145 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.810737 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.811023 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.811154 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.811270 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.811378 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.811689 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.815008 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twvl5\" (UniqueName: \"kubernetes.io/projected/e3dd4d9e-bd28-4db6-82ba-8c576f8e2de0-kube-api-access-twvl5\") pod \"openshift-apiserver-operator-796bbdcf4f-2m6qp\" (UID: \"e3dd4d9e-bd28-4db6-82ba-8c576f8e2de0\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2m6qp"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.816437 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.818323 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xlrh\" (UniqueName: \"kubernetes.io/projected/fc4330e6-2bf8-4d0c-bae1-c0e1544a5642-kube-api-access-8xlrh\") pod \"machine-approver-56656f9798-hw7wm\" (UID: \"fc4330e6-2bf8-4d0c-bae1-c0e1544a5642\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hw7wm"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.818322 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78c5s\" (UniqueName: \"kubernetes.io/projected/46c5cfaf-052d-46ac-a977-41f70c6d0368-kube-api-access-78c5s\") pod \"openshift-controller-manager-operator-756b6f6bc6-644kz\" (UID: \"46c5cfaf-052d-46ac-a977-41f70c6d0368\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-644kz"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.858292 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.858381 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.858504 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.858676 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.858676 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.858790 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.858889 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2m6qp"]
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.858979 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860006 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfp5d\" (UniqueName: \"kubernetes.io/projected/f8782986-5304-487f-962e-5b2e9233ab75-kube-api-access-bfp5d\") pod \"controller-manager-879f6c89f-dtkzb\" (UID: \"f8782986-5304-487f-962e-5b2e9233ab75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860060 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860107 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-audit-policies\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860132 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860154 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860189 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r276z\" (UniqueName: \"kubernetes.io/projected/0491dd66-90bc-45d1-a72d-79dbe3f5711e-kube-api-access-r276z\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860206 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-etcd-client\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860244 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860263 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860290 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-audit-dir\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860310 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce512d0d-7510-49cf-af98-c09f19031ab1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-msh6q\" (UID: \"ce512d0d-7510-49cf-af98-c09f19031ab1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-msh6q"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860330 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0491dd66-90bc-45d1-a72d-79dbe3f5711e-image-import-ca\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860359 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0491dd66-90bc-45d1-a72d-79dbe3f5711e-audit\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860375 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0491dd66-90bc-45d1-a72d-79dbe3f5711e-etcd-client\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl"
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860399 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-serving-cert\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4"
Dec 01 10:33:38 crc
kubenswrapper[4909]: I1201 10:33:38.860414 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtt6p\" (UniqueName: \"kubernetes.io/projected/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-kube-api-access-rtt6p\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860441 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxrh4\" (UniqueName: \"kubernetes.io/projected/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-kube-api-access-fxrh4\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860457 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ce512d0d-7510-49cf-af98-c09f19031ab1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-msh6q\" (UID: \"ce512d0d-7510-49cf-af98-c09f19031ab1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-msh6q" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860483 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8782986-5304-487f-962e-5b2e9233ab75-config\") pod \"controller-manager-879f6c89f-dtkzb\" (UID: \"f8782986-5304-487f-962e-5b2e9233ab75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860501 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4crbr\" (UniqueName: \"kubernetes.io/projected/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-kube-api-access-4crbr\") pod 
\"route-controller-manager-6576b87f9c-j6zbf\" (UID: \"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860520 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860535 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce512d0d-7510-49cf-af98-c09f19031ab1-config\") pod \"kube-apiserver-operator-766d6c64bb-msh6q\" (UID: \"ce512d0d-7510-49cf-af98-c09f19031ab1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-msh6q" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860553 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860570 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860588 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0491dd66-90bc-45d1-a72d-79dbe3f5711e-node-pullsecrets\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860604 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-client-ca\") pod \"route-controller-manager-6576b87f9c-j6zbf\" (UID: \"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860628 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0491dd66-90bc-45d1-a72d-79dbe3f5711e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860648 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538-trusted-ca\") pod \"ingress-operator-5b745b69d9-xmccs\" (UID: \"9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xmccs" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860672 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc 
kubenswrapper[4909]: I1201 10:33:38.860696 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-encryption-config\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860711 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b6c7\" (UniqueName: \"kubernetes.io/projected/9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538-kube-api-access-5b6c7\") pod \"ingress-operator-5b745b69d9-xmccs\" (UID: \"9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xmccs" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860737 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0491dd66-90bc-45d1-a72d-79dbe3f5711e-config\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860758 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0491dd66-90bc-45d1-a72d-79dbe3f5711e-serving-cert\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860774 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0491dd66-90bc-45d1-a72d-79dbe3f5711e-encryption-config\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" 
Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860792 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8782986-5304-487f-962e-5b2e9233ab75-serving-cert\") pod \"controller-manager-879f6c89f-dtkzb\" (UID: \"f8782986-5304-487f-962e-5b2e9233ab75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860820 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8782986-5304-487f-962e-5b2e9233ab75-client-ca\") pod \"controller-manager-879f6c89f-dtkzb\" (UID: \"f8782986-5304-487f-962e-5b2e9233ab75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860837 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8782986-5304-487f-962e-5b2e9233ab75-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dtkzb\" (UID: \"f8782986-5304-487f-962e-5b2e9233ab75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860854 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-audit-policies\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860912 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-audit-dir\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: 
\"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860938 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538-metrics-tls\") pod \"ingress-operator-5b745b69d9-xmccs\" (UID: \"9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xmccs" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860960 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.860986 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.861014 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.861038 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-config\") pod \"route-controller-manager-6576b87f9c-j6zbf\" (UID: \"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.861059 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-serving-cert\") pod \"route-controller-manager-6576b87f9c-j6zbf\" (UID: \"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.861099 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0491dd66-90bc-45d1-a72d-79dbe3f5711e-audit-dir\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.861126 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xmccs\" (UID: \"9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xmccs" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.861144 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.861149 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.861181 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0491dd66-90bc-45d1-a72d-79dbe3f5711e-etcd-serving-ca\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.861428 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-w927p"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.861940 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.862149 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0491dd66-90bc-45d1-a72d-79dbe3f5711e-etcd-serving-ca\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.862233 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0491dd66-90bc-45d1-a72d-79dbe3f5711e-node-pullsecrets\") pod \"apiserver-76f77b778f-m2qzl\" (UID: 
\"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.862239 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.862573 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-audit-policies\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.862951 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5f9554d-08bc-4e4a-8178-50cdc07e89c6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-d9vlp\" (UID: \"f5f9554d-08bc-4e4a-8178-50cdc07e89c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d9vlp" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.863075 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-w927p" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.863213 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-client-ca\") pod \"route-controller-manager-6576b87f9c-j6zbf\" (UID: \"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.863265 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-audit-dir\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.863295 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.864006 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0491dd66-90bc-45d1-a72d-79dbe3f5711e-audit\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.864498 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0491dd66-90bc-45d1-a72d-79dbe3f5711e-audit-dir\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.864928 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-config\") pod \"route-controller-manager-6576b87f9c-j6zbf\" (UID: \"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.865066 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wftds\" (UniqueName: \"kubernetes.io/projected/f5f9554d-08bc-4e4a-8178-50cdc07e89c6-kube-api-access-wftds\") pod \"cluster-image-registry-operator-dc59b4c8b-d9vlp\" (UID: \"f5f9554d-08bc-4e4a-8178-50cdc07e89c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d9vlp" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.865295 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0491dd66-90bc-45d1-a72d-79dbe3f5711e-config\") pod \"apiserver-76f77b778f-m2qzl\" (UID: 
\"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.866331 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0491dd66-90bc-45d1-a72d-79dbe3f5711e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.866495 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0491dd66-90bc-45d1-a72d-79dbe3f5711e-image-import-ca\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.867348 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce512d0d-7510-49cf-af98-c09f19031ab1-config\") pod \"kube-apiserver-operator-766d6c64bb-msh6q\" (UID: \"ce512d0d-7510-49cf-af98-c09f19031ab1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-msh6q" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.867516 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.867626 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5f9554d-08bc-4e4a-8178-50cdc07e89c6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-d9vlp\" (UID: 
\"f5f9554d-08bc-4e4a-8178-50cdc07e89c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d9vlp" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.867676 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8782986-5304-487f-962e-5b2e9233ab75-config\") pod \"controller-manager-879f6c89f-dtkzb\" (UID: \"f8782986-5304-487f-962e-5b2e9233ab75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.867990 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-etcd-client\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.868232 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8782986-5304-487f-962e-5b2e9233ab75-client-ca\") pod \"controller-manager-879f6c89f-dtkzb\" (UID: \"f8782986-5304-487f-962e-5b2e9233ab75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.868756 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-serving-cert\") pod \"route-controller-manager-6576b87f9c-j6zbf\" (UID: \"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.868790 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcn97\" (UniqueName: \"kubernetes.io/projected/987cc233-91d0-4ed1-8d93-62e90e5e0925-kube-api-access-xcn97\") pod 
\"downloads-7954f5f757-2n2sj\" (UID: \"987cc233-91d0-4ed1-8d93-62e90e5e0925\") " pod="openshift-console/downloads-7954f5f757-2n2sj" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.868864 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.869344 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce512d0d-7510-49cf-af98-c09f19031ab1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-msh6q\" (UID: \"ce512d0d-7510-49cf-af98-c09f19031ab1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-msh6q" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.869394 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-serving-cert\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.869530 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.869809 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0491dd66-90bc-45d1-a72d-79dbe3f5711e-serving-cert\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.870173 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8782986-5304-487f-962e-5b2e9233ab75-serving-cert\") pod \"controller-manager-879f6c89f-dtkzb\" (UID: 
\"f8782986-5304-487f-962e-5b2e9233ab75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.871241 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-encryption-config\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.871259 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-644kz"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.871394 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hw7wm" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.872784 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0491dd66-90bc-45d1-a72d-79dbe3f5711e-encryption-config\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.874259 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.877486 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0491dd66-90bc-45d1-a72d-79dbe3f5711e-etcd-client\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.877512 4909 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.879493 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.879575 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8782986-5304-487f-962e-5b2e9233ab75-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dtkzb\" (UID: \"f8782986-5304-487f-962e-5b2e9233ab75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.879814 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.879995 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.880113 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.880390 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8r4s" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.880798 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.882036 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.882357 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.882706 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mz2g7"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.885386 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.889324 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-msh6q"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.889450 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mz2g7" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.897613 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zmhkr"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.898555 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-zmhkr" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.899140 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.899538 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.900430 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.900734 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d9vlp"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.901794 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cqnc4"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.905552 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cqnc4" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.908143 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d9vlp" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.910902 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7cd6g"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.914638 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gtxhz"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.915349 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.917343 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.917385 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gtxhz" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.918000 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-m2qzl"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.923912 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-2n2sj" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.925933 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2n2sj"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.927351 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.928349 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtkzb"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.930151 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-vh74j"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.930436 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2m6qp" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.932363 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2pjzb"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.933297 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fklv5"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.934557 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g7qpw"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.936916 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.938084 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wfgm2"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.938542 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.938864 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-644kz" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.940167 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5wkg2"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.941569 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nm8z2"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.942817 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h7pgb"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.944311 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4gg8d"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.945522 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7mhb9"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.952250 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xmccs"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.954914 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dgdsb"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.955625 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.956408 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.961047 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-w927p"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.961785 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.965283 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-455s8"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.968424 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538-trusted-ca\") pod \"ingress-operator-5b745b69d9-xmccs\" (UID: \"9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xmccs" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.968455 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.968480 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b6c7\" (UniqueName: \"kubernetes.io/projected/9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538-kube-api-access-5b6c7\") pod \"ingress-operator-5b745b69d9-xmccs\" (UID: \"9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xmccs" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.968509 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-audit-policies\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.968531 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-audit-dir\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.968549 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538-metrics-tls\") pod \"ingress-operator-5b745b69d9-xmccs\" (UID: \"9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xmccs" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.968572 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.968604 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.968628 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.968657 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xmccs\" (UID: \"9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xmccs" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.968678 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.968718 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.968744 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.968766 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.968824 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.968850 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.968938 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxrh4\" (UniqueName: \"kubernetes.io/projected/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-kube-api-access-fxrh4\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.968987 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.969719 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hw7wm" event={"ID":"fc4330e6-2bf8-4d0c-bae1-c0e1544a5642","Type":"ContainerStarted","Data":"e8b6bcc67cd2768b329cf3b072838e11ed00b3261e8bc6ad160ae977f04ca768"} Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.969900 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-audit-dir\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.970229 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.970713 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-audit-policies\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.971227 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-dkdgx"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.971227 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.972013 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.972739 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.973742 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-shvkd"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.974252 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.974472 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dkdgx" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.975417 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-shvkd" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.975790 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.976430 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.978161 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-j597g"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.978166 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 
10:33:38.978437 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.980587 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-ck6xv"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.982114 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b56dj"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.983423 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.983433 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.983664 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hpz48"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.985064 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca/service-ca-9c57cc56f-zmhkr"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.986422 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cqnc4"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.987170 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.987683 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.989452 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-k5k4b"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.990833 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-k5k4b" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.991153 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-tgkkj"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.991595 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tgkkj" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.993961 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dgdsb"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.994007 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-67lbv"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.997808 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.998124 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-shvkd"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.998173 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tgkkj"] Dec 01 10:33:38 crc kubenswrapper[4909]: I1201 10:33:38.998921 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-k5k4b"] Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.010380 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7cd6g"] Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.011966 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gtxhz"] Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.012700 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mz2g7"] Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.016153 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.038544 4909 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.057286 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.077465 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.096666 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.116973 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.136643 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.146642 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8r4s"]
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.155791 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.186222 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.196083 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.216777 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.236188 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.256609 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.276185 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.296473 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.316589 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.336088 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.343675 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538-metrics-tls\") pod \"ingress-operator-5b745b69d9-xmccs\" (UID: \"9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xmccs"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.365472 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.372851 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d9vlp"]
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.373288 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538-trusted-ca\") pod \"ingress-operator-5b745b69d9-xmccs\" (UID: \"9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xmccs"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.375542 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 01 10:33:39 crc kubenswrapper[4909]: W1201 10:33:39.381018 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5f9554d_08bc_4e4a_8178_50cdc07e89c6.slice/crio-76cec18ad3b3550dc52aa0d74a0456660333f9cdb54bedd44cfa5b29f9309b69 WatchSource:0}: Error finding container 76cec18ad3b3550dc52aa0d74a0456660333f9cdb54bedd44cfa5b29f9309b69: Status 404 returned error can't find the container with id 76cec18ad3b3550dc52aa0d74a0456660333f9cdb54bedd44cfa5b29f9309b69
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.382420 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2n2sj"]
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.395801 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.416243 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.448867 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.455440 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.470528 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2m6qp"]
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.473074 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-644kz"]
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.476121 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.495558 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.516456 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.536179 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Dec 01 10:33:39 crc kubenswrapper[4909]: W1201 10:33:39.536289 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3dd4d9e_bd28_4db6_82ba_8c576f8e2de0.slice/crio-aad056fa1084aec2af7f67410ed19137feff31a9a718ff2b3b2a51c42fca7077 WatchSource:0}: Error finding container aad056fa1084aec2af7f67410ed19137feff31a9a718ff2b3b2a51c42fca7077: Status 404 returned error can't find the container with id aad056fa1084aec2af7f67410ed19137feff31a9a718ff2b3b2a51c42fca7077
Dec 01 10:33:39 crc kubenswrapper[4909]: W1201 10:33:39.541202 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46c5cfaf_052d_46ac_a977_41f70c6d0368.slice/crio-01583761807fed1f2d2cbe290d07af808922d5f6b535dc9cf5832bd56527a28a WatchSource:0}: Error finding container 01583761807fed1f2d2cbe290d07af808922d5f6b535dc9cf5832bd56527a28a: Status 404 returned error can't find the container with id 01583761807fed1f2d2cbe290d07af808922d5f6b535dc9cf5832bd56527a28a
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.555465 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.577113 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.597435 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.618303 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.637162 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.657225 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.677089 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.696648 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.737224 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.757430 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.776234 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.798855 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.814423 4909 request.go:700] Waited for 1.019307317s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/secrets?fieldSelector=metadata.name%3Dkube-storage-version-migrator-sa-dockercfg-5xfcg&limit=500&resourceVersion=0
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.816916 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.837195 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.857067 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.876617 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.895344 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.916759 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.936173 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.956162 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.975346 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-644kz" event={"ID":"46c5cfaf-052d-46ac-a977-41f70c6d0368","Type":"ContainerStarted","Data":"02b7d3e3c52f119a922be6ccc22db76585b694d173fb897d35d001dd4d7ce2be"}
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.975619 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-644kz" event={"ID":"46c5cfaf-052d-46ac-a977-41f70c6d0368","Type":"ContainerStarted","Data":"01583761807fed1f2d2cbe290d07af808922d5f6b535dc9cf5832bd56527a28a"}
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.978498 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8r4s" event={"ID":"b57f31c3-dad4-47c9-88e7-369524b49e42","Type":"ContainerStarted","Data":"7cff54c034e5be8d3eaef4a0e32dfdf84e43591670dd5ba4da5c720e6fa7afc6"}
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.978611 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8r4s" event={"ID":"b57f31c3-dad4-47c9-88e7-369524b49e42","Type":"ContainerStarted","Data":"1e0c45049a9f6819085d0a6ce5d826fda61ea5542a3678be29be944face6e1c6"}
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.978642 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8r4s" event={"ID":"b57f31c3-dad4-47c9-88e7-369524b49e42","Type":"ContainerStarted","Data":"dc7c7e725c802f8d84a5d38722a737717422ad5418c0965a80faca4b8008c08a"}
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.980628 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hw7wm" event={"ID":"fc4330e6-2bf8-4d0c-bae1-c0e1544a5642","Type":"ContainerStarted","Data":"22c3f80ffcceb285435fe4c02c7883ff3e5b1cca3789815028d93d7834908c24"}
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.980705 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hw7wm" event={"ID":"fc4330e6-2bf8-4d0c-bae1-c0e1544a5642","Type":"ContainerStarted","Data":"9147af7daa52e946c45572b54fcedc1b7c849474cc494325ef9069afe19fe402"}
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.982309 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2m6qp" event={"ID":"e3dd4d9e-bd28-4db6-82ba-8c576f8e2de0","Type":"ContainerStarted","Data":"3d8214de3cff32acd93c52946ec08e54f3b399ea6e6f37e7ef98b8b577412b09"}
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.982374 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2m6qp" event={"ID":"e3dd4d9e-bd28-4db6-82ba-8c576f8e2de0","Type":"ContainerStarted","Data":"aad056fa1084aec2af7f67410ed19137feff31a9a718ff2b3b2a51c42fca7077"}
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.984365 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2n2sj" event={"ID":"987cc233-91d0-4ed1-8d93-62e90e5e0925","Type":"ContainerStarted","Data":"22b21e85cff60593fcf78cf45e4440eee645a1f336a9129996ed785e9233e184"}
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.984397 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2n2sj" event={"ID":"987cc233-91d0-4ed1-8d93-62e90e5e0925","Type":"ContainerStarted","Data":"0844e19f0fe083bec7faeabde4419f772c2c3f2e82fdd14fa0a8cc2349765f91"}
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.985172 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-2n2sj"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.987990 4909 patch_prober.go:28] interesting pod/downloads-7954f5f757-2n2sj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body=
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.988042 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2n2sj" podUID="987cc233-91d0-4ed1-8d93-62e90e5e0925" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused"
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.988569 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d9vlp" event={"ID":"f5f9554d-08bc-4e4a-8178-50cdc07e89c6","Type":"ContainerStarted","Data":"0e664f002ebb09746518bbcdcb600a66f0fb73ac032a329c9a5a96fe8e10f9ce"}
Dec 01 10:33:39 crc kubenswrapper[4909]: I1201 10:33:39.988607 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d9vlp" event={"ID":"f5f9554d-08bc-4e4a-8178-50cdc07e89c6","Type":"ContainerStarted","Data":"76cec18ad3b3550dc52aa0d74a0456660333f9cdb54bedd44cfa5b29f9309b69"}
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.020059 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfp5d\" (UniqueName: \"kubernetes.io/projected/f8782986-5304-487f-962e-5b2e9233ab75-kube-api-access-bfp5d\") pod \"controller-manager-879f6c89f-dtkzb\" (UID: \"f8782986-5304-487f-962e-5b2e9233ab75\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.033272 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r276z\" (UniqueName: \"kubernetes.io/projected/0491dd66-90bc-45d1-a72d-79dbe3f5711e-kube-api-access-r276z\") pod \"apiserver-76f77b778f-m2qzl\" (UID: \"0491dd66-90bc-45d1-a72d-79dbe3f5711e\") " pod="openshift-apiserver/apiserver-76f77b778f-m2qzl"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.055903 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.057646 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtt6p\" (UniqueName: \"kubernetes.io/projected/18a8084c-5f3b-469c-8dfc-35f67bd9a0c4-kube-api-access-rtt6p\") pod \"apiserver-7bbb656c7d-lptv4\" (UID: \"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.076455 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.112827 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4crbr\" (UniqueName: \"kubernetes.io/projected/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-kube-api-access-4crbr\") pod \"route-controller-manager-6576b87f9c-j6zbf\" (UID: \"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.132621 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ce512d0d-7510-49cf-af98-c09f19031ab1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-msh6q\" (UID: \"ce512d0d-7510-49cf-af98-c09f19031ab1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-msh6q"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.137530 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.147576 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-m2qzl"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.159365 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.160284 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.165270 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-msh6q"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.177054 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.196546 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.215243 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.216864 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.239341 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.259187 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.261141 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.276815 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.300555 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.316491 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.336433 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.358950 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.378049 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.394498 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-m2qzl"]
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.395671 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.416239 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.429234 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-msh6q"]
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.438530 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.457463 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.476423 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.478594 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4"]
Dec 01 10:33:40 crc kubenswrapper[4909]: W1201 10:33:40.512999 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18a8084c_5f3b_469c_8dfc_35f67bd9a0c4.slice/crio-7a50a47ec430d46e285f67f4a8e9f1ee93316ea48fd779ade5fb45dae402ee13 WatchSource:0}: Error finding container 7a50a47ec430d46e285f67f4a8e9f1ee93316ea48fd779ade5fb45dae402ee13: Status 404 returned error can't find the container with id 7a50a47ec430d46e285f67f4a8e9f1ee93316ea48fd779ade5fb45dae402ee13
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.516706 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.517034 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.538297 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.557071 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf"]
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.557395 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.576319 4909 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.591221 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtkzb"]
Dec 01 10:33:40 crc kubenswrapper[4909]: W1201 10:33:40.591358 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c2c3a59_5dd8_4288_b7ac_47ff78abb6b8.slice/crio-cc18a219bba8e8289b2601fff0491ee707e25a0f2316dcb2581ef2f14fe296d9 WatchSource:0}: Error finding container cc18a219bba8e8289b2601fff0491ee707e25a0f2316dcb2581ef2f14fe296d9: Status 404 returned error can't find the container with id cc18a219bba8e8289b2601fff0491ee707e25a0f2316dcb2581ef2f14fe296d9
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.595978 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.619998 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.656132 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xmccs\" (UID: \"9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xmccs"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.679227 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b6c7\" (UniqueName: \"kubernetes.io/projected/9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538-kube-api-access-5b6c7\") pod \"ingress-operator-5b745b69d9-xmccs\" (UID: \"9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xmccs"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.697036 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.709843 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxrh4\" (UniqueName: \"kubernetes.io/projected/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-kube-api-access-fxrh4\") pod \"oauth-openshift-558db77b4-hpz48\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-hpz48"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.718164 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.736006 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.756167 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.776149 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.779408 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xmccs"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.796420 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.817070 4909 request.go:700] Waited for 1.825844511s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-default-metrics-tls&limit=500&resourceVersion=0
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.819189 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.836562 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.856665 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.877771 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.901170 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.950229 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xmccs"]
Dec 01 10:33:40 crc kubenswrapper[4909]: W1201 10:33:40.953786 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9028fad8_d0dc_4d8b_bbe0_1b7a9cd1f538.slice/crio-abf3684bd608ece3d40e5d07bde11568f3a4e4fa2af523f07e9059aa51503751 WatchSource:0}: Error finding container abf3684bd608ece3d40e5d07bde11568f3a4e4fa2af523f07e9059aa51503751: Status 404 returned error can't find the container with id abf3684bd608ece3d40e5d07bde11568f3a4e4fa2af523f07e9059aa51503751
Dec 01 10:33:40 crc kubenswrapper[4909]: I1201 10:33:40.992123 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" event={"ID":"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4","Type":"ContainerStarted","Data":"7a50a47ec430d46e285f67f4a8e9f1ee93316ea48fd779ade5fb45dae402ee13"}
Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.092911 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-ca-trust-extracted\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9"
Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.093790 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-installation-pull-secrets\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9"
Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.094166 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-trusted-ca\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9"
Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.095464 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hpz48"
Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.101555 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-registry-tls\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9"
Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.101668 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-bound-sa-token\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9"
Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.101842 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-registry-certificates\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9"
Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.101906 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9"
Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.101955 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnmlf\" (UniqueName: \"kubernetes.io/projected/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-kube-api-access-mnmlf\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9"
Dec 01 10:33:41 crc kubenswrapper[4909]: E1201 10:33:41.104186 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:41.603964401 +0000 UTC m=+138.838435299 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.114848 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xmccs" event={"ID":"9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538","Type":"ContainerStarted","Data":"abf3684bd608ece3d40e5d07bde11568f3a4e4fa2af523f07e9059aa51503751"}
Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.120846 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf" event={"ID":"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8","Type":"ContainerStarted","Data":"cc18a219bba8e8289b2601fff0491ee707e25a0f2316dcb2581ef2f14fe296d9"}
Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.125592 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" event={"ID":"0491dd66-90bc-45d1-a72d-79dbe3f5711e","Type":"ContainerStarted","Data":"7896f423ab7d05f8129ed22ddee5172aeab0790c40959de6f14515a4572ee36e"}
Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.128160 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb" event={"ID":"f8782986-5304-487f-962e-5b2e9233ab75","Type":"ContainerStarted","Data":"432d34971d4094e88a2190899f31113034076f54ae18369037abc57d19e79dee"}
Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.138335 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-msh6q" event={"ID":"ce512d0d-7510-49cf-af98-c09f19031ab1","Type":"ContainerStarted","Data":"51c9983f1475e305a79c733bfe7c2424bbf7c666aa4c1608ee73331903769f9b"}
Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.139373 4909 patch_prober.go:28] interesting pod/downloads-7954f5f757-2n2sj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body=
Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.139429 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2n2sj" podUID="987cc233-91d0-4ed1-8d93-62e90e5e0925" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused"
Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.191010 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks"
Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.203406 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName:
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.203622 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75033a4c-93b8-44c8-9456-437afb1b80ce-config\") pod \"kube-controller-manager-operator-78b949d7b-nm8z2\" (UID: \"75033a4c-93b8-44c8-9456-437afb1b80ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nm8z2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.203653 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c58fab6-7c2c-449d-9b3c-4f8414397cea-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-g7qpw\" (UID: \"4c58fab6-7c2c-449d-9b3c-4f8414397cea\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g7qpw" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.203680 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmq2r\" (UniqueName: \"kubernetes.io/projected/5041fa62-cf08-4d18-8226-c8dd04437e14-kube-api-access-xmq2r\") pod \"openshift-config-operator-7777fb866f-ck6xv\" (UID: \"5041fa62-cf08-4d18-8226-c8dd04437e14\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ck6xv" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.203716 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-bound-sa-token\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.203739 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxmfp\" (UniqueName: \"kubernetes.io/projected/6744f6c5-4a6c-4188-b4c4-0a25aac51b1d-kube-api-access-kxmfp\") pod \"machine-api-operator-5694c8668f-5wkg2\" (UID: \"6744f6c5-4a6c-4188-b4c4-0a25aac51b1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5wkg2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.203766 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4fb9ef38-4036-4c8b-8ba6-a5535f39d12c-images\") pod \"machine-config-operator-74547568cd-j597g\" (UID: \"4fb9ef38-4036-4c8b-8ba6-a5535f39d12c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j597g" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.203800 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a366b491-4c3c-40a9-86a0-a82d686b1a15-console-serving-cert\") pod \"console-f9d7485db-wfgm2\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.203827 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-registry-certificates\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.203849 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6744f6c5-4a6c-4188-b4c4-0a25aac51b1d-images\") pod \"machine-api-operator-5694c8668f-5wkg2\" (UID: \"6744f6c5-4a6c-4188-b4c4-0a25aac51b1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5wkg2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.203892 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15547104-3163-44a2-9b36-4f4d0f3cde37-service-ca-bundle\") pod \"router-default-5444994796-2q24w\" (UID: \"15547104-3163-44a2-9b36-4f4d0f3cde37\") " pod="openshift-ingress/router-default-5444994796-2q24w" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.203917 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15547104-3163-44a2-9b36-4f4d0f3cde37-metrics-certs\") pod \"router-default-5444994796-2q24w\" (UID: \"15547104-3163-44a2-9b36-4f4d0f3cde37\") " pod="openshift-ingress/router-default-5444994796-2q24w" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.203967 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45af2286-dfcc-44d1-bfcc-66cffbc195f0-config\") pod \"console-operator-58897d9998-2pjzb\" (UID: \"45af2286-dfcc-44d1-bfcc-66cffbc195f0\") " pod="openshift-console-operator/console-operator-58897d9998-2pjzb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204040 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-ca-trust-extracted\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204066 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a9fc5e65-280d-4313-99c8-6f32595347e9-etcd-ca\") pod \"etcd-operator-b45778765-vh74j\" (UID: \"a9fc5e65-280d-4313-99c8-6f32595347e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204090 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c58fab6-7c2c-449d-9b3c-4f8414397cea-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-g7qpw\" (UID: \"4c58fab6-7c2c-449d-9b3c-4f8414397cea\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g7qpw" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204145 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-trusted-ca\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204169 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-installation-pull-secrets\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204193 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5041fa62-cf08-4d18-8226-c8dd04437e14-serving-cert\") pod \"openshift-config-operator-7777fb866f-ck6xv\" (UID: 
\"5041fa62-cf08-4d18-8226-c8dd04437e14\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ck6xv" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204215 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6744f6c5-4a6c-4188-b4c4-0a25aac51b1d-config\") pod \"machine-api-operator-5694c8668f-5wkg2\" (UID: \"6744f6c5-4a6c-4188-b4c4-0a25aac51b1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5wkg2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204244 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccbcca82-d1ed-481a-806f-cb338b5260da-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fklv5\" (UID: \"ccbcca82-d1ed-481a-806f-cb338b5260da\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fklv5" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204271 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45af2286-dfcc-44d1-bfcc-66cffbc195f0-serving-cert\") pod \"console-operator-58897d9998-2pjzb\" (UID: \"45af2286-dfcc-44d1-bfcc-66cffbc195f0\") " pod="openshift-console-operator/console-operator-58897d9998-2pjzb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204296 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a9fc5e65-280d-4313-99c8-6f32595347e9-etcd-client\") pod \"etcd-operator-b45778765-vh74j\" (UID: \"a9fc5e65-280d-4313-99c8-6f32595347e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204323 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a9fc5e65-280d-4313-99c8-6f32595347e9-etcd-service-ca\") pod \"etcd-operator-b45778765-vh74j\" (UID: \"a9fc5e65-280d-4313-99c8-6f32595347e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204364 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6744f6c5-4a6c-4188-b4c4-0a25aac51b1d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5wkg2\" (UID: \"6744f6c5-4a6c-4188-b4c4-0a25aac51b1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5wkg2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204386 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spswr\" (UniqueName: \"kubernetes.io/projected/4cb1b32e-b43a-4261-b515-3a2c08feb70b-kube-api-access-spswr\") pod \"machine-config-controller-84d6567774-4gg8d\" (UID: \"4cb1b32e-b43a-4261-b515-3a2c08feb70b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gg8d" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204406 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/15547104-3163-44a2-9b36-4f4d0f3cde37-stats-auth\") pod \"router-default-5444994796-2q24w\" (UID: \"15547104-3163-44a2-9b36-4f4d0f3cde37\") " pod="openshift-ingress/router-default-5444994796-2q24w" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204441 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4cb1b32e-b43a-4261-b515-3a2c08feb70b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4gg8d\" (UID: 
\"4cb1b32e-b43a-4261-b515-3a2c08feb70b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gg8d" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204464 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85wd6\" (UniqueName: \"kubernetes.io/projected/a366b491-4c3c-40a9-86a0-a82d686b1a15-kube-api-access-85wd6\") pod \"console-f9d7485db-wfgm2\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204486 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccbcca82-d1ed-481a-806f-cb338b5260da-service-ca-bundle\") pod \"authentication-operator-69f744f599-fklv5\" (UID: \"ccbcca82-d1ed-481a-806f-cb338b5260da\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fklv5" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204508 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-service-ca\") pod \"console-f9d7485db-wfgm2\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204531 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c58fab6-7c2c-449d-9b3c-4f8414397cea-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-g7qpw\" (UID: \"4c58fab6-7c2c-449d-9b3c-4f8414397cea\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g7qpw" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204558 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvnd2\" (UniqueName: \"kubernetes.io/projected/ddb74535-f2d6-4818-a4c9-4876876d70cf-kube-api-access-rvnd2\") pod \"dns-operator-744455d44c-h7pgb\" (UID: \"ddb74535-f2d6-4818-a4c9-4876876d70cf\") " pod="openshift-dns-operator/dns-operator-744455d44c-h7pgb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204595 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-registry-tls\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204619 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cb1b32e-b43a-4261-b515-3a2c08feb70b-proxy-tls\") pod \"machine-config-controller-84d6567774-4gg8d\" (UID: \"4cb1b32e-b43a-4261-b515-3a2c08feb70b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gg8d" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204646 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45af2286-dfcc-44d1-bfcc-66cffbc195f0-trusted-ca\") pod \"console-operator-58897d9998-2pjzb\" (UID: \"45af2286-dfcc-44d1-bfcc-66cffbc195f0\") " pod="openshift-console-operator/console-operator-58897d9998-2pjzb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204669 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/15547104-3163-44a2-9b36-4f4d0f3cde37-default-certificate\") pod \"router-default-5444994796-2q24w\" (UID: \"15547104-3163-44a2-9b36-4f4d0f3cde37\") " 
pod="openshift-ingress/router-default-5444994796-2q24w" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204691 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75033a4c-93b8-44c8-9456-437afb1b80ce-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nm8z2\" (UID: \"75033a4c-93b8-44c8-9456-437afb1b80ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nm8z2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204714 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh756\" (UniqueName: \"kubernetes.io/projected/45af2286-dfcc-44d1-bfcc-66cffbc195f0-kube-api-access-gh756\") pod \"console-operator-58897d9998-2pjzb\" (UID: \"45af2286-dfcc-44d1-bfcc-66cffbc195f0\") " pod="openshift-console-operator/console-operator-58897d9998-2pjzb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204740 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fc5e65-280d-4313-99c8-6f32595347e9-config\") pod \"etcd-operator-b45778765-vh74j\" (UID: \"a9fc5e65-280d-4313-99c8-6f32595347e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204764 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75033a4c-93b8-44c8-9456-437afb1b80ce-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nm8z2\" (UID: \"75033a4c-93b8-44c8-9456-437afb1b80ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nm8z2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204807 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-mnmlf\" (UniqueName: \"kubernetes.io/projected/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-kube-api-access-mnmlf\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204836 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-console-config\") pod \"console-f9d7485db-wfgm2\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204860 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccbcca82-d1ed-481a-806f-cb338b5260da-serving-cert\") pod \"authentication-operator-69f744f599-fklv5\" (UID: \"ccbcca82-d1ed-481a-806f-cb338b5260da\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fklv5" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204916 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fc5e65-280d-4313-99c8-6f32595347e9-serving-cert\") pod \"etcd-operator-b45778765-vh74j\" (UID: \"a9fc5e65-280d-4313-99c8-6f32595347e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204942 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddb74535-f2d6-4818-a4c9-4876876d70cf-metrics-tls\") pod \"dns-operator-744455d44c-h7pgb\" (UID: \"ddb74535-f2d6-4818-a4c9-4876876d70cf\") " pod="openshift-dns-operator/dns-operator-744455d44c-h7pgb" Dec 01 
10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.204965 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt5hj\" (UniqueName: \"kubernetes.io/projected/ccbcca82-d1ed-481a-806f-cb338b5260da-kube-api-access-pt5hj\") pod \"authentication-operator-69f744f599-fklv5\" (UID: \"ccbcca82-d1ed-481a-806f-cb338b5260da\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fklv5" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.205183 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6thkn\" (UniqueName: \"kubernetes.io/projected/4fb9ef38-4036-4c8b-8ba6-a5535f39d12c-kube-api-access-6thkn\") pod \"machine-config-operator-74547568cd-j597g\" (UID: \"4fb9ef38-4036-4c8b-8ba6-a5535f39d12c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j597g" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.205212 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5041fa62-cf08-4d18-8226-c8dd04437e14-available-featuregates\") pod \"openshift-config-operator-7777fb866f-ck6xv\" (UID: \"5041fa62-cf08-4d18-8226-c8dd04437e14\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ck6xv" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.205253 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccbcca82-d1ed-481a-806f-cb338b5260da-config\") pod \"authentication-operator-69f744f599-fklv5\" (UID: \"ccbcca82-d1ed-481a-806f-cb338b5260da\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fklv5" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.205278 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-9fndh\" (UniqueName: \"kubernetes.io/projected/15547104-3163-44a2-9b36-4f4d0f3cde37-kube-api-access-9fndh\") pod \"router-default-5444994796-2q24w\" (UID: \"15547104-3163-44a2-9b36-4f4d0f3cde37\") " pod="openshift-ingress/router-default-5444994796-2q24w" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.205299 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4fb9ef38-4036-4c8b-8ba6-a5535f39d12c-proxy-tls\") pod \"machine-config-operator-74547568cd-j597g\" (UID: \"4fb9ef38-4036-4c8b-8ba6-a5535f39d12c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j597g" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.205322 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz7ng\" (UniqueName: \"kubernetes.io/projected/a9fc5e65-280d-4313-99c8-6f32595347e9-kube-api-access-zz7ng\") pod \"etcd-operator-b45778765-vh74j\" (UID: \"a9fc5e65-280d-4313-99c8-6f32595347e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.205346 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-trusted-ca-bundle\") pod \"console-f9d7485db-wfgm2\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.205371 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-oauth-serving-cert\") pod \"console-f9d7485db-wfgm2\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 
10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.205398 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a366b491-4c3c-40a9-86a0-a82d686b1a15-console-oauth-config\") pod \"console-f9d7485db-wfgm2\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.205422 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4fb9ef38-4036-4c8b-8ba6-a5535f39d12c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-j597g\" (UID: \"4fb9ef38-4036-4c8b-8ba6-a5535f39d12c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j597g" Dec 01 10:33:41 crc kubenswrapper[4909]: E1201 10:33:41.205565 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:41.70553922 +0000 UTC m=+138.940010118 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.213300 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-registry-certificates\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.213756 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-ca-trust-extracted\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.219722 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-installation-pull-secrets\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.220908 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-trusted-ca\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: 
\"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.224590 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-bound-sa-token\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.225134 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-registry-tls\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.271451 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnmlf\" (UniqueName: \"kubernetes.io/projected/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-kube-api-access-mnmlf\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.306767 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8927315f-7e49-4493-a6e2-34da6d18167f-tmpfs\") pod \"packageserver-d55dfcdfc-8sbn7\" (UID: \"8927315f-7e49-4493-a6e2-34da6d18167f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.306821 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5041fa62-cf08-4d18-8226-c8dd04437e14-serving-cert\") pod 
\"openshift-config-operator-7777fb866f-ck6xv\" (UID: \"5041fa62-cf08-4d18-8226-c8dd04437e14\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ck6xv" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.306846 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6744f6c5-4a6c-4188-b4c4-0a25aac51b1d-config\") pod \"machine-api-operator-5694c8668f-5wkg2\" (UID: \"6744f6c5-4a6c-4188-b4c4-0a25aac51b1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5wkg2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.306867 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f8fe7db3-57ec-46e1-9cf8-ed1d429ec342-plugins-dir\") pod \"csi-hostpathplugin-dgdsb\" (UID: \"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342\") " pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.306919 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb45j\" (UniqueName: \"kubernetes.io/projected/f992ea3b-b39a-48c5-84d5-67a8b6efcbd1-kube-api-access-sb45j\") pod \"olm-operator-6b444d44fb-shvkd\" (UID: \"f992ea3b-b39a-48c5-84d5-67a8b6efcbd1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-shvkd" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.306965 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e86961fb-83c5-4428-a0d8-f864f75fb581-certs\") pod \"machine-config-server-dkdgx\" (UID: \"e86961fb-83c5-4428-a0d8-f864f75fb581\") " pod="openshift-machine-config-operator/machine-config-server-dkdgx" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307000 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6d54983b-b2f2-4c7f-9bd8-27ebc8b4eca3-srv-cert\") pod \"catalog-operator-68c6474976-mz2g7\" (UID: \"6d54983b-b2f2-4c7f-9bd8-27ebc8b4eca3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mz2g7" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307017 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ddaf67bb-df33-49ef-b65e-e77eb630f5e5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-w927p\" (UID: \"ddaf67bb-df33-49ef-b65e-e77eb630f5e5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-w927p" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307038 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccbcca82-d1ed-481a-806f-cb338b5260da-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fklv5\" (UID: \"ccbcca82-d1ed-481a-806f-cb338b5260da\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fklv5" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307056 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc72371b-fc8b-4050-afc6-deca5f398d3d-config-volume\") pod \"dns-default-k5k4b\" (UID: \"fc72371b-fc8b-4050-afc6-deca5f398d3d\") " pod="openshift-dns/dns-default-k5k4b" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307075 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45af2286-dfcc-44d1-bfcc-66cffbc195f0-serving-cert\") pod \"console-operator-58897d9998-2pjzb\" (UID: \"45af2286-dfcc-44d1-bfcc-66cffbc195f0\") " pod="openshift-console-operator/console-operator-58897d9998-2pjzb" Dec 01 10:33:41 crc kubenswrapper[4909]: 
I1201 10:33:41.307092 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/06b88e94-fb03-42ce-8ad6-e2e1dfd2fde0-signing-cabundle\") pod \"service-ca-9c57cc56f-zmhkr\" (UID: \"06b88e94-fb03-42ce-8ad6-e2e1dfd2fde0\") " pod="openshift-service-ca/service-ca-9c57cc56f-zmhkr" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307111 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq8qq\" (UniqueName: \"kubernetes.io/projected/6d54983b-b2f2-4c7f-9bd8-27ebc8b4eca3-kube-api-access-vq8qq\") pod \"catalog-operator-68c6474976-mz2g7\" (UID: \"6d54983b-b2f2-4c7f-9bd8-27ebc8b4eca3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mz2g7" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307127 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbpn9\" (UniqueName: \"kubernetes.io/projected/82e39ed8-5fa2-43f5-af61-a15fe62522c6-kube-api-access-kbpn9\") pod \"kube-storage-version-migrator-operator-b67b599dd-b56dj\" (UID: \"82e39ed8-5fa2-43f5-af61-a15fe62522c6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b56dj" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307158 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a9fc5e65-280d-4313-99c8-6f32595347e9-etcd-client\") pod \"etcd-operator-b45778765-vh74j\" (UID: \"a9fc5e65-280d-4313-99c8-6f32595347e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307178 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/9e922379-f723-4440-a90e-182b0917c969-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gtxhz\" (UID: \"9e922379-f723-4440-a90e-182b0917c969\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gtxhz" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307199 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a9fc5e65-280d-4313-99c8-6f32595347e9-etcd-service-ca\") pod \"etcd-operator-b45778765-vh74j\" (UID: \"a9fc5e65-280d-4313-99c8-6f32595347e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307218 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f992ea3b-b39a-48c5-84d5-67a8b6efcbd1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-shvkd\" (UID: \"f992ea3b-b39a-48c5-84d5-67a8b6efcbd1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-shvkd" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307233 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e293010-2e80-4435-a45f-d8011c7aa2fe-cert\") pod \"ingress-canary-tgkkj\" (UID: \"4e293010-2e80-4435-a45f-d8011c7aa2fe\") " pod="openshift-ingress-canary/ingress-canary-tgkkj" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307250 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82e39ed8-5fa2-43f5-af61-a15fe62522c6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-b56dj\" (UID: \"82e39ed8-5fa2-43f5-af61-a15fe62522c6\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b56dj" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307304 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spswr\" (UniqueName: \"kubernetes.io/projected/4cb1b32e-b43a-4261-b515-3a2c08feb70b-kube-api-access-spswr\") pod \"machine-config-controller-84d6567774-4gg8d\" (UID: \"4cb1b32e-b43a-4261-b515-3a2c08feb70b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gg8d" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307323 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/15547104-3163-44a2-9b36-4f4d0f3cde37-stats-auth\") pod \"router-default-5444994796-2q24w\" (UID: \"15547104-3163-44a2-9b36-4f4d0f3cde37\") " pod="openshift-ingress/router-default-5444994796-2q24w" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307341 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6744f6c5-4a6c-4188-b4c4-0a25aac51b1d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5wkg2\" (UID: \"6744f6c5-4a6c-4188-b4c4-0a25aac51b1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5wkg2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307381 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e86961fb-83c5-4428-a0d8-f864f75fb581-node-bootstrap-token\") pod \"machine-config-server-dkdgx\" (UID: \"e86961fb-83c5-4428-a0d8-f864f75fb581\") " pod="openshift-machine-config-operator/machine-config-server-dkdgx" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307840 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f8fe7db3-57ec-46e1-9cf8-ed1d429ec342-mountpoint-dir\") pod \"csi-hostpathplugin-dgdsb\" (UID: \"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342\") " pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307860 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3df5462f-4575-4957-8e0a-8bdb27aeebca-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-67lbv\" (UID: \"3df5462f-4575-4957-8e0a-8bdb27aeebca\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-67lbv" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307927 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4cb1b32e-b43a-4261-b515-3a2c08feb70b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4gg8d\" (UID: \"4cb1b32e-b43a-4261-b515-3a2c08feb70b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gg8d" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307947 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85wd6\" (UniqueName: \"kubernetes.io/projected/a366b491-4c3c-40a9-86a0-a82d686b1a15-kube-api-access-85wd6\") pod \"console-f9d7485db-wfgm2\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307979 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccbcca82-d1ed-481a-806f-cb338b5260da-service-ca-bundle\") pod \"authentication-operator-69f744f599-fklv5\" (UID: \"ccbcca82-d1ed-481a-806f-cb338b5260da\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-fklv5" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.307995 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-service-ca\") pod \"console-f9d7485db-wfgm2\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308016 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c58fab6-7c2c-449d-9b3c-4f8414397cea-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-g7qpw\" (UID: \"4c58fab6-7c2c-449d-9b3c-4f8414397cea\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g7qpw" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308034 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvnd2\" (UniqueName: \"kubernetes.io/projected/ddb74535-f2d6-4818-a4c9-4876876d70cf-kube-api-access-rvnd2\") pod \"dns-operator-744455d44c-h7pgb\" (UID: \"ddb74535-f2d6-4818-a4c9-4876876d70cf\") " pod="openshift-dns-operator/dns-operator-744455d44c-h7pgb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308050 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25zlp\" (UniqueName: \"kubernetes.io/projected/ac0e24f9-2a35-4c96-b694-472eab5c4f15-kube-api-access-25zlp\") pod \"marketplace-operator-79b997595-7cd6g\" (UID: \"ac0e24f9-2a35-4c96-b694-472eab5c4f15\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308102 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/fc72371b-fc8b-4050-afc6-deca5f398d3d-metrics-tls\") pod \"dns-default-k5k4b\" (UID: \"fc72371b-fc8b-4050-afc6-deca5f398d3d\") " pod="openshift-dns/dns-default-k5k4b" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308143 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wc9n\" (UniqueName: \"kubernetes.io/projected/9e922379-f723-4440-a90e-182b0917c969-kube-api-access-8wc9n\") pod \"control-plane-machine-set-operator-78cbb6b69f-gtxhz\" (UID: \"9e922379-f723-4440-a90e-182b0917c969\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gtxhz" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308174 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cb1b32e-b43a-4261-b515-3a2c08feb70b-proxy-tls\") pod \"machine-config-controller-84d6567774-4gg8d\" (UID: \"4cb1b32e-b43a-4261-b515-3a2c08feb70b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gg8d" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308191 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8927315f-7e49-4493-a6e2-34da6d18167f-apiservice-cert\") pod \"packageserver-d55dfcdfc-8sbn7\" (UID: \"8927315f-7e49-4493-a6e2-34da6d18167f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308207 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8927315f-7e49-4493-a6e2-34da6d18167f-webhook-cert\") pod \"packageserver-d55dfcdfc-8sbn7\" (UID: \"8927315f-7e49-4493-a6e2-34da6d18167f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7" Dec 01 10:33:41 crc 
kubenswrapper[4909]: I1201 10:33:41.308224 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45af2286-dfcc-44d1-bfcc-66cffbc195f0-trusted-ca\") pod \"console-operator-58897d9998-2pjzb\" (UID: \"45af2286-dfcc-44d1-bfcc-66cffbc195f0\") " pod="openshift-console-operator/console-operator-58897d9998-2pjzb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308258 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/15547104-3163-44a2-9b36-4f4d0f3cde37-default-certificate\") pod \"router-default-5444994796-2q24w\" (UID: \"15547104-3163-44a2-9b36-4f4d0f3cde37\") " pod="openshift-ingress/router-default-5444994796-2q24w" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308277 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75033a4c-93b8-44c8-9456-437afb1b80ce-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nm8z2\" (UID: \"75033a4c-93b8-44c8-9456-437afb1b80ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nm8z2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308298 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac0e24f9-2a35-4c96-b694-472eab5c4f15-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7cd6g\" (UID: \"ac0e24f9-2a35-4c96-b694-472eab5c4f15\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308316 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cae2ea64-3e52-4152-b919-c40d0d273f39-config\") pod 
\"service-ca-operator-777779d784-cqnc4\" (UID: \"cae2ea64-3e52-4152-b919-c40d0d273f39\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cqnc4" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308353 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh756\" (UniqueName: \"kubernetes.io/projected/45af2286-dfcc-44d1-bfcc-66cffbc195f0-kube-api-access-gh756\") pod \"console-operator-58897d9998-2pjzb\" (UID: \"45af2286-dfcc-44d1-bfcc-66cffbc195f0\") " pod="openshift-console-operator/console-operator-58897d9998-2pjzb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308373 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ac0e24f9-2a35-4c96-b694-472eab5c4f15-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7cd6g\" (UID: \"ac0e24f9-2a35-4c96-b694-472eab5c4f15\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308390 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fc5e65-280d-4313-99c8-6f32595347e9-config\") pod \"etcd-operator-b45778765-vh74j\" (UID: \"a9fc5e65-280d-4313-99c8-6f32595347e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308417 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75033a4c-93b8-44c8-9456-437afb1b80ce-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nm8z2\" (UID: \"75033a4c-93b8-44c8-9456-437afb1b80ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nm8z2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308452 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rdpv\" (UniqueName: \"kubernetes.io/projected/f8fe7db3-57ec-46e1-9cf8-ed1d429ec342-kube-api-access-4rdpv\") pod \"csi-hostpathplugin-dgdsb\" (UID: \"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342\") " pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308776 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6r4r\" (UniqueName: \"kubernetes.io/projected/4e293010-2e80-4435-a45f-d8011c7aa2fe-kube-api-access-q6r4r\") pod \"ingress-canary-tgkkj\" (UID: \"4e293010-2e80-4435-a45f-d8011c7aa2fe\") " pod="openshift-ingress-canary/ingress-canary-tgkkj" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308800 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlgtz\" (UniqueName: \"kubernetes.io/projected/e86961fb-83c5-4428-a0d8-f864f75fb581-kube-api-access-tlgtz\") pod \"machine-config-server-dkdgx\" (UID: \"e86961fb-83c5-4428-a0d8-f864f75fb581\") " pod="openshift-machine-config-operator/machine-config-server-dkdgx" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308818 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6mkb\" (UniqueName: \"kubernetes.io/projected/3df5462f-4575-4957-8e0a-8bdb27aeebca-kube-api-access-h6mkb\") pod \"package-server-manager-789f6589d5-67lbv\" (UID: \"3df5462f-4575-4957-8e0a-8bdb27aeebca\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-67lbv" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308833 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82e39ed8-5fa2-43f5-af61-a15fe62522c6-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-b56dj\" (UID: \"82e39ed8-5fa2-43f5-af61-a15fe62522c6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b56dj" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308856 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308891 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6d54983b-b2f2-4c7f-9bd8-27ebc8b4eca3-profile-collector-cert\") pod \"catalog-operator-68c6474976-mz2g7\" (UID: \"6d54983b-b2f2-4c7f-9bd8-27ebc8b4eca3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mz2g7" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308915 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccbcca82-d1ed-481a-806f-cb338b5260da-serving-cert\") pod \"authentication-operator-69f744f599-fklv5\" (UID: \"ccbcca82-d1ed-481a-806f-cb338b5260da\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fklv5" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.308938 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-console-config\") pod \"console-f9d7485db-wfgm2\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 
10:33:41.308985 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fc5e65-280d-4313-99c8-6f32595347e9-serving-cert\") pod \"etcd-operator-b45778765-vh74j\" (UID: \"a9fc5e65-280d-4313-99c8-6f32595347e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309004 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6thkn\" (UniqueName: \"kubernetes.io/projected/4fb9ef38-4036-4c8b-8ba6-a5535f39d12c-kube-api-access-6thkn\") pod \"machine-config-operator-74547568cd-j597g\" (UID: \"4fb9ef38-4036-4c8b-8ba6-a5535f39d12c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j597g" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309023 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt5hj\" (UniqueName: \"kubernetes.io/projected/ccbcca82-d1ed-481a-806f-cb338b5260da-kube-api-access-pt5hj\") pod \"authentication-operator-69f744f599-fklv5\" (UID: \"ccbcca82-d1ed-481a-806f-cb338b5260da\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fklv5" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309048 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddb74535-f2d6-4818-a4c9-4876876d70cf-metrics-tls\") pod \"dns-operator-744455d44c-h7pgb\" (UID: \"ddb74535-f2d6-4818-a4c9-4876876d70cf\") " pod="openshift-dns-operator/dns-operator-744455d44c-h7pgb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309063 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5041fa62-cf08-4d18-8226-c8dd04437e14-available-featuregates\") pod \"openshift-config-operator-7777fb866f-ck6xv\" (UID: 
\"5041fa62-cf08-4d18-8226-c8dd04437e14\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ck6xv" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309095 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccbcca82-d1ed-481a-806f-cb338b5260da-config\") pod \"authentication-operator-69f744f599-fklv5\" (UID: \"ccbcca82-d1ed-481a-806f-cb338b5260da\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fklv5" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309122 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fndh\" (UniqueName: \"kubernetes.io/projected/15547104-3163-44a2-9b36-4f4d0f3cde37-kube-api-access-9fndh\") pod \"router-default-5444994796-2q24w\" (UID: \"15547104-3163-44a2-9b36-4f4d0f3cde37\") " pod="openshift-ingress/router-default-5444994796-2q24w" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309137 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4fb9ef38-4036-4c8b-8ba6-a5535f39d12c-proxy-tls\") pod \"machine-config-operator-74547568cd-j597g\" (UID: \"4fb9ef38-4036-4c8b-8ba6-a5535f39d12c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j597g" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309178 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz7ng\" (UniqueName: \"kubernetes.io/projected/a9fc5e65-280d-4313-99c8-6f32595347e9-kube-api-access-zz7ng\") pod \"etcd-operator-b45778765-vh74j\" (UID: \"a9fc5e65-280d-4313-99c8-6f32595347e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309197 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/d6190ef7-3deb-4bd9-9c73-109572e871d1-config-volume\") pod \"collect-profiles-29409750-j469h\" (UID: \"d6190ef7-3deb-4bd9-9c73-109572e871d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309218 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f8fe7db3-57ec-46e1-9cf8-ed1d429ec342-registration-dir\") pod \"csi-hostpathplugin-dgdsb\" (UID: \"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342\") " pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309240 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-trusted-ca-bundle\") pod \"console-f9d7485db-wfgm2\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309262 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-oauth-serving-cert\") pod \"console-f9d7485db-wfgm2\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309282 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn7p9\" (UniqueName: \"kubernetes.io/projected/cae2ea64-3e52-4152-b919-c40d0d273f39-kube-api-access-cn7p9\") pod \"service-ca-operator-777779d784-cqnc4\" (UID: \"cae2ea64-3e52-4152-b919-c40d0d273f39\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cqnc4" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309341 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hkvz\" (UniqueName: \"kubernetes.io/projected/d6190ef7-3deb-4bd9-9c73-109572e871d1-kube-api-access-8hkvz\") pod \"collect-profiles-29409750-j469h\" (UID: \"d6190ef7-3deb-4bd9-9c73-109572e871d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309388 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a366b491-4c3c-40a9-86a0-a82d686b1a15-console-oauth-config\") pod \"console-f9d7485db-wfgm2\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309410 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f8fe7db3-57ec-46e1-9cf8-ed1d429ec342-socket-dir\") pod \"csi-hostpathplugin-dgdsb\" (UID: \"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342\") " pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309448 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7szjx\" (UniqueName: \"kubernetes.io/projected/ddaf67bb-df33-49ef-b65e-e77eb630f5e5-kube-api-access-7szjx\") pod \"multus-admission-controller-857f4d67dd-w927p\" (UID: \"ddaf67bb-df33-49ef-b65e-e77eb630f5e5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-w927p" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309474 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4fb9ef38-4036-4c8b-8ba6-a5535f39d12c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-j597g\" (UID: \"4fb9ef38-4036-4c8b-8ba6-a5535f39d12c\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j597g" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309496 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6190ef7-3deb-4bd9-9c73-109572e871d1-secret-volume\") pod \"collect-profiles-29409750-j469h\" (UID: \"d6190ef7-3deb-4bd9-9c73-109572e871d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309522 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c58fab6-7c2c-449d-9b3c-4f8414397cea-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-g7qpw\" (UID: \"4c58fab6-7c2c-449d-9b3c-4f8414397cea\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g7qpw" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309546 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75033a4c-93b8-44c8-9456-437afb1b80ce-config\") pod \"kube-controller-manager-operator-78b949d7b-nm8z2\" (UID: \"75033a4c-93b8-44c8-9456-437afb1b80ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nm8z2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309573 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f992ea3b-b39a-48c5-84d5-67a8b6efcbd1-srv-cert\") pod \"olm-operator-6b444d44fb-shvkd\" (UID: \"f992ea3b-b39a-48c5-84d5-67a8b6efcbd1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-shvkd" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309614 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xmq2r\" (UniqueName: \"kubernetes.io/projected/5041fa62-cf08-4d18-8226-c8dd04437e14-kube-api-access-xmq2r\") pod \"openshift-config-operator-7777fb866f-ck6xv\" (UID: \"5041fa62-cf08-4d18-8226-c8dd04437e14\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ck6xv" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309639 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f8fe7db3-57ec-46e1-9cf8-ed1d429ec342-csi-data-dir\") pod \"csi-hostpathplugin-dgdsb\" (UID: \"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342\") " pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309665 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxmfp\" (UniqueName: \"kubernetes.io/projected/6744f6c5-4a6c-4188-b4c4-0a25aac51b1d-kube-api-access-kxmfp\") pod \"machine-api-operator-5694c8668f-5wkg2\" (UID: \"6744f6c5-4a6c-4188-b4c4-0a25aac51b1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5wkg2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309723 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4fb9ef38-4036-4c8b-8ba6-a5535f39d12c-images\") pod \"machine-config-operator-74547568cd-j597g\" (UID: \"4fb9ef38-4036-4c8b-8ba6-a5535f39d12c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j597g" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309749 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bjjb\" (UniqueName: \"kubernetes.io/projected/da6c8746-9c5d-4990-b66f-cae50ffbc83f-kube-api-access-8bjjb\") pod \"migrator-59844c95c7-455s8\" (UID: \"da6c8746-9c5d-4990-b66f-cae50ffbc83f\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-455s8" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309773 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/06b88e94-fb03-42ce-8ad6-e2e1dfd2fde0-signing-key\") pod \"service-ca-9c57cc56f-zmhkr\" (UID: \"06b88e94-fb03-42ce-8ad6-e2e1dfd2fde0\") " pod="openshift-service-ca/service-ca-9c57cc56f-zmhkr" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309796 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cae2ea64-3e52-4152-b919-c40d0d273f39-serving-cert\") pod \"service-ca-operator-777779d784-cqnc4\" (UID: \"cae2ea64-3e52-4152-b919-c40d0d273f39\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cqnc4" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309856 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a366b491-4c3c-40a9-86a0-a82d686b1a15-console-serving-cert\") pod \"console-f9d7485db-wfgm2\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309902 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6744f6c5-4a6c-4188-b4c4-0a25aac51b1d-images\") pod \"machine-api-operator-5694c8668f-5wkg2\" (UID: \"6744f6c5-4a6c-4188-b4c4-0a25aac51b1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5wkg2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309945 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq6cw\" (UniqueName: 
\"kubernetes.io/projected/fc72371b-fc8b-4050-afc6-deca5f398d3d-kube-api-access-vq6cw\") pod \"dns-default-k5k4b\" (UID: \"fc72371b-fc8b-4050-afc6-deca5f398d3d\") " pod="openshift-dns/dns-default-k5k4b" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.309996 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15547104-3163-44a2-9b36-4f4d0f3cde37-service-ca-bundle\") pod \"router-default-5444994796-2q24w\" (UID: \"15547104-3163-44a2-9b36-4f4d0f3cde37\") " pod="openshift-ingress/router-default-5444994796-2q24w" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.310023 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15547104-3163-44a2-9b36-4f4d0f3cde37-metrics-certs\") pod \"router-default-5444994796-2q24w\" (UID: \"15547104-3163-44a2-9b36-4f4d0f3cde37\") " pod="openshift-ingress/router-default-5444994796-2q24w" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.310048 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g6c9\" (UniqueName: \"kubernetes.io/projected/8927315f-7e49-4493-a6e2-34da6d18167f-kube-api-access-9g6c9\") pod \"packageserver-d55dfcdfc-8sbn7\" (UID: \"8927315f-7e49-4493-a6e2-34da6d18167f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.310105 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45af2286-dfcc-44d1-bfcc-66cffbc195f0-config\") pod \"console-operator-58897d9998-2pjzb\" (UID: \"45af2286-dfcc-44d1-bfcc-66cffbc195f0\") " pod="openshift-console-operator/console-operator-58897d9998-2pjzb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.310134 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a9fc5e65-280d-4313-99c8-6f32595347e9-etcd-ca\") pod \"etcd-operator-b45778765-vh74j\" (UID: \"a9fc5e65-280d-4313-99c8-6f32595347e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.310208 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c58fab6-7c2c-449d-9b3c-4f8414397cea-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-g7qpw\" (UID: \"4c58fab6-7c2c-449d-9b3c-4f8414397cea\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g7qpw" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.310235 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bk5h\" (UniqueName: \"kubernetes.io/projected/06b88e94-fb03-42ce-8ad6-e2e1dfd2fde0-kube-api-access-8bk5h\") pod \"service-ca-9c57cc56f-zmhkr\" (UID: \"06b88e94-fb03-42ce-8ad6-e2e1dfd2fde0\") " pod="openshift-service-ca/service-ca-9c57cc56f-zmhkr" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.314678 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5041fa62-cf08-4d18-8226-c8dd04437e14-serving-cert\") pod \"openshift-config-operator-7777fb866f-ck6xv\" (UID: \"5041fa62-cf08-4d18-8226-c8dd04437e14\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ck6xv" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.315396 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6744f6c5-4a6c-4188-b4c4-0a25aac51b1d-config\") pod \"machine-api-operator-5694c8668f-5wkg2\" (UID: \"6744f6c5-4a6c-4188-b4c4-0a25aac51b1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5wkg2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.317662 
4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccbcca82-d1ed-481a-806f-cb338b5260da-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fklv5\" (UID: \"ccbcca82-d1ed-481a-806f-cb338b5260da\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fklv5" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.324725 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4fb9ef38-4036-4c8b-8ba6-a5535f39d12c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-j597g\" (UID: \"4fb9ef38-4036-4c8b-8ba6-a5535f39d12c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j597g" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.325359 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45af2286-dfcc-44d1-bfcc-66cffbc195f0-serving-cert\") pod \"console-operator-58897d9998-2pjzb\" (UID: \"45af2286-dfcc-44d1-bfcc-66cffbc195f0\") " pod="openshift-console-operator/console-operator-58897d9998-2pjzb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.326130 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75033a4c-93b8-44c8-9456-437afb1b80ce-config\") pod \"kube-controller-manager-operator-78b949d7b-nm8z2\" (UID: \"75033a4c-93b8-44c8-9456-437afb1b80ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nm8z2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.326273 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-trusted-ca-bundle\") pod \"console-f9d7485db-wfgm2\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " 
pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.327286 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4fb9ef38-4036-4c8b-8ba6-a5535f39d12c-images\") pod \"machine-config-operator-74547568cd-j597g\" (UID: \"4fb9ef38-4036-4c8b-8ba6-a5535f39d12c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j597g" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.328009 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-oauth-serving-cert\") pod \"console-f9d7485db-wfgm2\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.328019 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cb1b32e-b43a-4261-b515-3a2c08feb70b-proxy-tls\") pod \"machine-config-controller-84d6567774-4gg8d\" (UID: \"4cb1b32e-b43a-4261-b515-3a2c08feb70b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gg8d" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.328331 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a9fc5e65-280d-4313-99c8-6f32595347e9-etcd-service-ca\") pod \"etcd-operator-b45778765-vh74j\" (UID: \"a9fc5e65-280d-4313-99c8-6f32595347e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.329451 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6744f6c5-4a6c-4188-b4c4-0a25aac51b1d-images\") pod \"machine-api-operator-5694c8668f-5wkg2\" (UID: \"6744f6c5-4a6c-4188-b4c4-0a25aac51b1d\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-5wkg2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.329893 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45af2286-dfcc-44d1-bfcc-66cffbc195f0-trusted-ca\") pod \"console-operator-58897d9998-2pjzb\" (UID: \"45af2286-dfcc-44d1-bfcc-66cffbc195f0\") " pod="openshift-console-operator/console-operator-58897d9998-2pjzb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.329987 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fc5e65-280d-4313-99c8-6f32595347e9-serving-cert\") pod \"etcd-operator-b45778765-vh74j\" (UID: \"a9fc5e65-280d-4313-99c8-6f32595347e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.330703 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccbcca82-d1ed-481a-806f-cb338b5260da-config\") pod \"authentication-operator-69f744f599-fklv5\" (UID: \"ccbcca82-d1ed-481a-806f-cb338b5260da\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fklv5" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.330799 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddb74535-f2d6-4818-a4c9-4876876d70cf-metrics-tls\") pod \"dns-operator-744455d44c-h7pgb\" (UID: \"ddb74535-f2d6-4818-a4c9-4876876d70cf\") " pod="openshift-dns-operator/dns-operator-744455d44c-h7pgb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.332736 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15547104-3163-44a2-9b36-4f4d0f3cde37-service-ca-bundle\") pod \"router-default-5444994796-2q24w\" (UID: \"15547104-3163-44a2-9b36-4f4d0f3cde37\") 
" pod="openshift-ingress/router-default-5444994796-2q24w" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.332763 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5041fa62-cf08-4d18-8226-c8dd04437e14-available-featuregates\") pod \"openshift-config-operator-7777fb866f-ck6xv\" (UID: \"5041fa62-cf08-4d18-8226-c8dd04437e14\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ck6xv" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.333115 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a9fc5e65-280d-4313-99c8-6f32595347e9-etcd-ca\") pod \"etcd-operator-b45778765-vh74j\" (UID: \"a9fc5e65-280d-4313-99c8-6f32595347e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.334479 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c58fab6-7c2c-449d-9b3c-4f8414397cea-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-g7qpw\" (UID: \"4c58fab6-7c2c-449d-9b3c-4f8414397cea\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g7qpw" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.334939 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15547104-3163-44a2-9b36-4f4d0f3cde37-metrics-certs\") pod \"router-default-5444994796-2q24w\" (UID: \"15547104-3163-44a2-9b36-4f4d0f3cde37\") " pod="openshift-ingress/router-default-5444994796-2q24w" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.335921 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45af2286-dfcc-44d1-bfcc-66cffbc195f0-config\") pod \"console-operator-58897d9998-2pjzb\" (UID: 
\"45af2286-dfcc-44d1-bfcc-66cffbc195f0\") " pod="openshift-console-operator/console-operator-58897d9998-2pjzb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.336411 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6744f6c5-4a6c-4188-b4c4-0a25aac51b1d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5wkg2\" (UID: \"6744f6c5-4a6c-4188-b4c4-0a25aac51b1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5wkg2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.336477 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a9fc5e65-280d-4313-99c8-6f32595347e9-etcd-client\") pod \"etcd-operator-b45778765-vh74j\" (UID: \"a9fc5e65-280d-4313-99c8-6f32595347e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.337176 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fc5e65-280d-4313-99c8-6f32595347e9-config\") pod \"etcd-operator-b45778765-vh74j\" (UID: \"a9fc5e65-280d-4313-99c8-6f32595347e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.337744 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75033a4c-93b8-44c8-9456-437afb1b80ce-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nm8z2\" (UID: \"75033a4c-93b8-44c8-9456-437afb1b80ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nm8z2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.338327 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a366b491-4c3c-40a9-86a0-a82d686b1a15-console-serving-cert\") pod \"console-f9d7485db-wfgm2\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.338650 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4cb1b32e-b43a-4261-b515-3a2c08feb70b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4gg8d\" (UID: \"4cb1b32e-b43a-4261-b515-3a2c08feb70b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gg8d" Dec 01 10:33:41 crc kubenswrapper[4909]: E1201 10:33:41.339028 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:41.839004144 +0000 UTC m=+139.073475032 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.339225 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccbcca82-d1ed-481a-806f-cb338b5260da-service-ca-bundle\") pod \"authentication-operator-69f744f599-fklv5\" (UID: \"ccbcca82-d1ed-481a-806f-cb338b5260da\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fklv5" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.341027 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ccbcca82-d1ed-481a-806f-cb338b5260da-serving-cert\") pod \"authentication-operator-69f744f599-fklv5\" (UID: \"ccbcca82-d1ed-481a-806f-cb338b5260da\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fklv5" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.341924 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/15547104-3163-44a2-9b36-4f4d0f3cde37-default-certificate\") pod \"router-default-5444994796-2q24w\" (UID: \"15547104-3163-44a2-9b36-4f4d0f3cde37\") " pod="openshift-ingress/router-default-5444994796-2q24w" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.343120 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-service-ca\") pod \"console-f9d7485db-wfgm2\" (UID: 
\"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.343604 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-console-config\") pod \"console-f9d7485db-wfgm2\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.343647 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/15547104-3163-44a2-9b36-4f4d0f3cde37-stats-auth\") pod \"router-default-5444994796-2q24w\" (UID: \"15547104-3163-44a2-9b36-4f4d0f3cde37\") " pod="openshift-ingress/router-default-5444994796-2q24w" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.344482 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c58fab6-7c2c-449d-9b3c-4f8414397cea-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-g7qpw\" (UID: \"4c58fab6-7c2c-449d-9b3c-4f8414397cea\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g7qpw" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.353909 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4fb9ef38-4036-4c8b-8ba6-a5535f39d12c-proxy-tls\") pod \"machine-config-operator-74547568cd-j597g\" (UID: \"4fb9ef38-4036-4c8b-8ba6-a5535f39d12c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j597g" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.360183 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a366b491-4c3c-40a9-86a0-a82d686b1a15-console-oauth-config\") pod 
\"console-f9d7485db-wfgm2\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.380760 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz7ng\" (UniqueName: \"kubernetes.io/projected/a9fc5e65-280d-4313-99c8-6f32595347e9-kube-api-access-zz7ng\") pod \"etcd-operator-b45778765-vh74j\" (UID: \"a9fc5e65-280d-4313-99c8-6f32595347e9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.396426 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c58fab6-7c2c-449d-9b3c-4f8414397cea-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-g7qpw\" (UID: \"4c58fab6-7c2c-449d-9b3c-4f8414397cea\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g7qpw" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.402410 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hpz48"] Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.411810 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:41 crc kubenswrapper[4909]: E1201 10:33:41.411990 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:41.911932924 +0000 UTC m=+139.146403822 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412090 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hkvz\" (UniqueName: \"kubernetes.io/projected/d6190ef7-3deb-4bd9-9c73-109572e871d1-kube-api-access-8hkvz\") pod \"collect-profiles-29409750-j469h\" (UID: \"d6190ef7-3deb-4bd9-9c73-109572e871d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412129 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f8fe7db3-57ec-46e1-9cf8-ed1d429ec342-socket-dir\") pod \"csi-hostpathplugin-dgdsb\" (UID: \"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342\") " pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412165 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7szjx\" (UniqueName: \"kubernetes.io/projected/ddaf67bb-df33-49ef-b65e-e77eb630f5e5-kube-api-access-7szjx\") pod \"multus-admission-controller-857f4d67dd-w927p\" (UID: \"ddaf67bb-df33-49ef-b65e-e77eb630f5e5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-w927p" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412185 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6190ef7-3deb-4bd9-9c73-109572e871d1-secret-volume\") pod 
\"collect-profiles-29409750-j469h\" (UID: \"d6190ef7-3deb-4bd9-9c73-109572e871d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412205 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f992ea3b-b39a-48c5-84d5-67a8b6efcbd1-srv-cert\") pod \"olm-operator-6b444d44fb-shvkd\" (UID: \"f992ea3b-b39a-48c5-84d5-67a8b6efcbd1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-shvkd" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412230 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f8fe7db3-57ec-46e1-9cf8-ed1d429ec342-csi-data-dir\") pod \"csi-hostpathplugin-dgdsb\" (UID: \"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342\") " pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412254 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bjjb\" (UniqueName: \"kubernetes.io/projected/da6c8746-9c5d-4990-b66f-cae50ffbc83f-kube-api-access-8bjjb\") pod \"migrator-59844c95c7-455s8\" (UID: \"da6c8746-9c5d-4990-b66f-cae50ffbc83f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-455s8" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412275 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/06b88e94-fb03-42ce-8ad6-e2e1dfd2fde0-signing-key\") pod \"service-ca-9c57cc56f-zmhkr\" (UID: \"06b88e94-fb03-42ce-8ad6-e2e1dfd2fde0\") " pod="openshift-service-ca/service-ca-9c57cc56f-zmhkr" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412293 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cae2ea64-3e52-4152-b919-c40d0d273f39-serving-cert\") pod \"service-ca-operator-777779d784-cqnc4\" (UID: \"cae2ea64-3e52-4152-b919-c40d0d273f39\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cqnc4" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412327 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq6cw\" (UniqueName: \"kubernetes.io/projected/fc72371b-fc8b-4050-afc6-deca5f398d3d-kube-api-access-vq6cw\") pod \"dns-default-k5k4b\" (UID: \"fc72371b-fc8b-4050-afc6-deca5f398d3d\") " pod="openshift-dns/dns-default-k5k4b" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412368 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g6c9\" (UniqueName: \"kubernetes.io/projected/8927315f-7e49-4493-a6e2-34da6d18167f-kube-api-access-9g6c9\") pod \"packageserver-d55dfcdfc-8sbn7\" (UID: \"8927315f-7e49-4493-a6e2-34da6d18167f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412391 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bk5h\" (UniqueName: \"kubernetes.io/projected/06b88e94-fb03-42ce-8ad6-e2e1dfd2fde0-kube-api-access-8bk5h\") pod \"service-ca-9c57cc56f-zmhkr\" (UID: \"06b88e94-fb03-42ce-8ad6-e2e1dfd2fde0\") " pod="openshift-service-ca/service-ca-9c57cc56f-zmhkr" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412408 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8927315f-7e49-4493-a6e2-34da6d18167f-tmpfs\") pod \"packageserver-d55dfcdfc-8sbn7\" (UID: \"8927315f-7e49-4493-a6e2-34da6d18167f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412428 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f8fe7db3-57ec-46e1-9cf8-ed1d429ec342-plugins-dir\") pod \"csi-hostpathplugin-dgdsb\" (UID: \"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342\") " pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412446 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb45j\" (UniqueName: \"kubernetes.io/projected/f992ea3b-b39a-48c5-84d5-67a8b6efcbd1-kube-api-access-sb45j\") pod \"olm-operator-6b444d44fb-shvkd\" (UID: \"f992ea3b-b39a-48c5-84d5-67a8b6efcbd1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-shvkd" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412467 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e86961fb-83c5-4428-a0d8-f864f75fb581-certs\") pod \"machine-config-server-dkdgx\" (UID: \"e86961fb-83c5-4428-a0d8-f864f75fb581\") " pod="openshift-machine-config-operator/machine-config-server-dkdgx" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412484 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6d54983b-b2f2-4c7f-9bd8-27ebc8b4eca3-srv-cert\") pod \"catalog-operator-68c6474976-mz2g7\" (UID: \"6d54983b-b2f2-4c7f-9bd8-27ebc8b4eca3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mz2g7" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412502 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ddaf67bb-df33-49ef-b65e-e77eb630f5e5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-w927p\" (UID: \"ddaf67bb-df33-49ef-b65e-e77eb630f5e5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-w927p" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412519 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc72371b-fc8b-4050-afc6-deca5f398d3d-config-volume\") pod \"dns-default-k5k4b\" (UID: \"fc72371b-fc8b-4050-afc6-deca5f398d3d\") " pod="openshift-dns/dns-default-k5k4b" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412536 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/06b88e94-fb03-42ce-8ad6-e2e1dfd2fde0-signing-cabundle\") pod \"service-ca-9c57cc56f-zmhkr\" (UID: \"06b88e94-fb03-42ce-8ad6-e2e1dfd2fde0\") " pod="openshift-service-ca/service-ca-9c57cc56f-zmhkr" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412553 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq8qq\" (UniqueName: \"kubernetes.io/projected/6d54983b-b2f2-4c7f-9bd8-27ebc8b4eca3-kube-api-access-vq8qq\") pod \"catalog-operator-68c6474976-mz2g7\" (UID: \"6d54983b-b2f2-4c7f-9bd8-27ebc8b4eca3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mz2g7" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412573 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbpn9\" (UniqueName: \"kubernetes.io/projected/82e39ed8-5fa2-43f5-af61-a15fe62522c6-kube-api-access-kbpn9\") pod \"kube-storage-version-migrator-operator-b67b599dd-b56dj\" (UID: \"82e39ed8-5fa2-43f5-af61-a15fe62522c6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b56dj" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412595 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e922379-f723-4440-a90e-182b0917c969-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gtxhz\" 
(UID: \"9e922379-f723-4440-a90e-182b0917c969\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gtxhz" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412636 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f992ea3b-b39a-48c5-84d5-67a8b6efcbd1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-shvkd\" (UID: \"f992ea3b-b39a-48c5-84d5-67a8b6efcbd1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-shvkd" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412669 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e293010-2e80-4435-a45f-d8011c7aa2fe-cert\") pod \"ingress-canary-tgkkj\" (UID: \"4e293010-2e80-4435-a45f-d8011c7aa2fe\") " pod="openshift-ingress-canary/ingress-canary-tgkkj" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412689 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82e39ed8-5fa2-43f5-af61-a15fe62522c6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-b56dj\" (UID: \"82e39ed8-5fa2-43f5-af61-a15fe62522c6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b56dj" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412722 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e86961fb-83c5-4428-a0d8-f864f75fb581-node-bootstrap-token\") pod \"machine-config-server-dkdgx\" (UID: \"e86961fb-83c5-4428-a0d8-f864f75fb581\") " pod="openshift-machine-config-operator/machine-config-server-dkdgx" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412739 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/f8fe7db3-57ec-46e1-9cf8-ed1d429ec342-mountpoint-dir\") pod \"csi-hostpathplugin-dgdsb\" (UID: \"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342\") " pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412759 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3df5462f-4575-4957-8e0a-8bdb27aeebca-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-67lbv\" (UID: \"3df5462f-4575-4957-8e0a-8bdb27aeebca\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-67lbv" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412803 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25zlp\" (UniqueName: \"kubernetes.io/projected/ac0e24f9-2a35-4c96-b694-472eab5c4f15-kube-api-access-25zlp\") pod \"marketplace-operator-79b997595-7cd6g\" (UID: \"ac0e24f9-2a35-4c96-b694-472eab5c4f15\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412821 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fc72371b-fc8b-4050-afc6-deca5f398d3d-metrics-tls\") pod \"dns-default-k5k4b\" (UID: \"fc72371b-fc8b-4050-afc6-deca5f398d3d\") " pod="openshift-dns/dns-default-k5k4b" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412847 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wc9n\" (UniqueName: \"kubernetes.io/projected/9e922379-f723-4440-a90e-182b0917c969-kube-api-access-8wc9n\") pod \"control-plane-machine-set-operator-78cbb6b69f-gtxhz\" (UID: \"9e922379-f723-4440-a90e-182b0917c969\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gtxhz" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 
10:33:41.412866 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8927315f-7e49-4493-a6e2-34da6d18167f-apiservice-cert\") pod \"packageserver-d55dfcdfc-8sbn7\" (UID: \"8927315f-7e49-4493-a6e2-34da6d18167f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412906 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8927315f-7e49-4493-a6e2-34da6d18167f-webhook-cert\") pod \"packageserver-d55dfcdfc-8sbn7\" (UID: \"8927315f-7e49-4493-a6e2-34da6d18167f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412926 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac0e24f9-2a35-4c96-b694-472eab5c4f15-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7cd6g\" (UID: \"ac0e24f9-2a35-4c96-b694-472eab5c4f15\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412943 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cae2ea64-3e52-4152-b919-c40d0d273f39-config\") pod \"service-ca-operator-777779d784-cqnc4\" (UID: \"cae2ea64-3e52-4152-b919-c40d0d273f39\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cqnc4" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412965 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ac0e24f9-2a35-4c96-b694-472eab5c4f15-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7cd6g\" (UID: \"ac0e24f9-2a35-4c96-b694-472eab5c4f15\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.412991 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rdpv\" (UniqueName: \"kubernetes.io/projected/f8fe7db3-57ec-46e1-9cf8-ed1d429ec342-kube-api-access-4rdpv\") pod \"csi-hostpathplugin-dgdsb\" (UID: \"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342\") " pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.413009 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6r4r\" (UniqueName: \"kubernetes.io/projected/4e293010-2e80-4435-a45f-d8011c7aa2fe-kube-api-access-q6r4r\") pod \"ingress-canary-tgkkj\" (UID: \"4e293010-2e80-4435-a45f-d8011c7aa2fe\") " pod="openshift-ingress-canary/ingress-canary-tgkkj" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.413030 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlgtz\" (UniqueName: \"kubernetes.io/projected/e86961fb-83c5-4428-a0d8-f864f75fb581-kube-api-access-tlgtz\") pod \"machine-config-server-dkdgx\" (UID: \"e86961fb-83c5-4428-a0d8-f864f75fb581\") " pod="openshift-machine-config-operator/machine-config-server-dkdgx" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.413049 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6mkb\" (UniqueName: \"kubernetes.io/projected/3df5462f-4575-4957-8e0a-8bdb27aeebca-kube-api-access-h6mkb\") pod \"package-server-manager-789f6589d5-67lbv\" (UID: \"3df5462f-4575-4957-8e0a-8bdb27aeebca\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-67lbv" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.413068 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82e39ed8-5fa2-43f5-af61-a15fe62522c6-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-b56dj\" (UID: \"82e39ed8-5fa2-43f5-af61-a15fe62522c6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b56dj" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.413089 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.413107 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6d54983b-b2f2-4c7f-9bd8-27ebc8b4eca3-profile-collector-cert\") pod \"catalog-operator-68c6474976-mz2g7\" (UID: \"6d54983b-b2f2-4c7f-9bd8-27ebc8b4eca3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mz2g7" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.413183 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6190ef7-3deb-4bd9-9c73-109572e871d1-config-volume\") pod \"collect-profiles-29409750-j469h\" (UID: \"d6190ef7-3deb-4bd9-9c73-109572e871d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.413202 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f8fe7db3-57ec-46e1-9cf8-ed1d429ec342-registration-dir\") pod \"csi-hostpathplugin-dgdsb\" (UID: \"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342\") " pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.413223 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn7p9\" (UniqueName: \"kubernetes.io/projected/cae2ea64-3e52-4152-b919-c40d0d273f39-kube-api-access-cn7p9\") pod \"service-ca-operator-777779d784-cqnc4\" (UID: \"cae2ea64-3e52-4152-b919-c40d0d273f39\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cqnc4" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.413761 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f8fe7db3-57ec-46e1-9cf8-ed1d429ec342-socket-dir\") pod \"csi-hostpathplugin-dgdsb\" (UID: \"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342\") " pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.415689 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f8fe7db3-57ec-46e1-9cf8-ed1d429ec342-plugins-dir\") pod \"csi-hostpathplugin-dgdsb\" (UID: \"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342\") " pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.416845 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f8fe7db3-57ec-46e1-9cf8-ed1d429ec342-csi-data-dir\") pod \"csi-hostpathplugin-dgdsb\" (UID: \"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342\") " pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.418598 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8927315f-7e49-4493-a6e2-34da6d18167f-tmpfs\") pod \"packageserver-d55dfcdfc-8sbn7\" (UID: \"8927315f-7e49-4493-a6e2-34da6d18167f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.419016 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cae2ea64-3e52-4152-b919-c40d0d273f39-config\") pod \"service-ca-operator-777779d784-cqnc4\" (UID: \"cae2ea64-3e52-4152-b919-c40d0d273f39\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cqnc4" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.420132 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f8fe7db3-57ec-46e1-9cf8-ed1d429ec342-mountpoint-dir\") pod \"csi-hostpathplugin-dgdsb\" (UID: \"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342\") " pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.420893 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82e39ed8-5fa2-43f5-af61-a15fe62522c6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-b56dj\" (UID: \"82e39ed8-5fa2-43f5-af61-a15fe62522c6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b56dj" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.421629 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e293010-2e80-4435-a45f-d8011c7aa2fe-cert\") pod \"ingress-canary-tgkkj\" (UID: \"4e293010-2e80-4435-a45f-d8011c7aa2fe\") " pod="openshift-ingress-canary/ingress-canary-tgkkj" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.421971 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82e39ed8-5fa2-43f5-af61-a15fe62522c6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-b56dj\" (UID: \"82e39ed8-5fa2-43f5-af61-a15fe62522c6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b56dj" Dec 01 10:33:41 crc 
kubenswrapper[4909]: I1201 10:33:41.422842 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cae2ea64-3e52-4152-b919-c40d0d273f39-serving-cert\") pod \"service-ca-operator-777779d784-cqnc4\" (UID: \"cae2ea64-3e52-4152-b919-c40d0d273f39\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cqnc4" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.423501 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fc72371b-fc8b-4050-afc6-deca5f398d3d-metrics-tls\") pod \"dns-default-k5k4b\" (UID: \"fc72371b-fc8b-4050-afc6-deca5f398d3d\") " pod="openshift-dns/dns-default-k5k4b" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.423715 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6190ef7-3deb-4bd9-9c73-109572e871d1-config-volume\") pod \"collect-profiles-29409750-j469h\" (UID: \"d6190ef7-3deb-4bd9-9c73-109572e871d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.423868 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f8fe7db3-57ec-46e1-9cf8-ed1d429ec342-registration-dir\") pod \"csi-hostpathplugin-dgdsb\" (UID: \"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342\") " pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" Dec 01 10:33:41 crc kubenswrapper[4909]: E1201 10:33:41.425526 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:41.925500171 +0000 UTC m=+139.159971259 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.425627 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ac0e24f9-2a35-4c96-b694-472eab5c4f15-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7cd6g\" (UID: \"ac0e24f9-2a35-4c96-b694-472eab5c4f15\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.425688 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/06b88e94-fb03-42ce-8ad6-e2e1dfd2fde0-signing-cabundle\") pod \"service-ca-9c57cc56f-zmhkr\" (UID: \"06b88e94-fb03-42ce-8ad6-e2e1dfd2fde0\") " pod="openshift-service-ca/service-ca-9c57cc56f-zmhkr" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.425832 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac0e24f9-2a35-4c96-b694-472eab5c4f15-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7cd6g\" (UID: \"ac0e24f9-2a35-4c96-b694-472eab5c4f15\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.426145 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3df5462f-4575-4957-8e0a-8bdb27aeebca-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-67lbv\" (UID: \"3df5462f-4575-4957-8e0a-8bdb27aeebca\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-67lbv" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.426737 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6190ef7-3deb-4bd9-9c73-109572e871d1-secret-volume\") pod \"collect-profiles-29409750-j469h\" (UID: \"d6190ef7-3deb-4bd9-9c73-109572e871d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.427198 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f992ea3b-b39a-48c5-84d5-67a8b6efcbd1-srv-cert\") pod \"olm-operator-6b444d44fb-shvkd\" (UID: \"f992ea3b-b39a-48c5-84d5-67a8b6efcbd1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-shvkd" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.431579 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ddaf67bb-df33-49ef-b65e-e77eb630f5e5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-w927p\" (UID: \"ddaf67bb-df33-49ef-b65e-e77eb630f5e5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-w927p" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.431691 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e86961fb-83c5-4428-a0d8-f864f75fb581-node-bootstrap-token\") pod \"machine-config-server-dkdgx\" (UID: \"e86961fb-83c5-4428-a0d8-f864f75fb581\") " pod="openshift-machine-config-operator/machine-config-server-dkdgx" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.433196 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/fc72371b-fc8b-4050-afc6-deca5f398d3d-config-volume\") pod \"dns-default-k5k4b\" (UID: \"fc72371b-fc8b-4050-afc6-deca5f398d3d\") " pod="openshift-dns/dns-default-k5k4b" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.433655 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmq2r\" (UniqueName: \"kubernetes.io/projected/5041fa62-cf08-4d18-8226-c8dd04437e14-kube-api-access-xmq2r\") pod \"openshift-config-operator-7777fb866f-ck6xv\" (UID: \"5041fa62-cf08-4d18-8226-c8dd04437e14\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ck6xv" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.434775 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8927315f-7e49-4493-a6e2-34da6d18167f-apiservice-cert\") pod \"packageserver-d55dfcdfc-8sbn7\" (UID: \"8927315f-7e49-4493-a6e2-34da6d18167f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.439623 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6d54983b-b2f2-4c7f-9bd8-27ebc8b4eca3-profile-collector-cert\") pod \"catalog-operator-68c6474976-mz2g7\" (UID: \"6d54983b-b2f2-4c7f-9bd8-27ebc8b4eca3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mz2g7" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.447637 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6d54983b-b2f2-4c7f-9bd8-27ebc8b4eca3-srv-cert\") pod \"catalog-operator-68c6474976-mz2g7\" (UID: \"6d54983b-b2f2-4c7f-9bd8-27ebc8b4eca3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mz2g7" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.447772 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e86961fb-83c5-4428-a0d8-f864f75fb581-certs\") pod \"machine-config-server-dkdgx\" (UID: \"e86961fb-83c5-4428-a0d8-f864f75fb581\") " pod="openshift-machine-config-operator/machine-config-server-dkdgx" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.448499 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f992ea3b-b39a-48c5-84d5-67a8b6efcbd1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-shvkd\" (UID: \"f992ea3b-b39a-48c5-84d5-67a8b6efcbd1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-shvkd" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.460784 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/06b88e94-fb03-42ce-8ad6-e2e1dfd2fde0-signing-key\") pod \"service-ca-9c57cc56f-zmhkr\" (UID: \"06b88e94-fb03-42ce-8ad6-e2e1dfd2fde0\") " pod="openshift-service-ca/service-ca-9c57cc56f-zmhkr" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.460865 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e922379-f723-4440-a90e-182b0917c969-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gtxhz\" (UID: \"9e922379-f723-4440-a90e-182b0917c969\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gtxhz" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.460964 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8927315f-7e49-4493-a6e2-34da6d18167f-webhook-cert\") pod \"packageserver-d55dfcdfc-8sbn7\" (UID: \"8927315f-7e49-4493-a6e2-34da6d18167f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7" Dec 01 10:33:41 crc kubenswrapper[4909]: 
I1201 10:33:41.461154 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxmfp\" (UniqueName: \"kubernetes.io/projected/6744f6c5-4a6c-4188-b4c4-0a25aac51b1d-kube-api-access-kxmfp\") pod \"machine-api-operator-5694c8668f-5wkg2\" (UID: \"6744f6c5-4a6c-4188-b4c4-0a25aac51b1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5wkg2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.464773 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6thkn\" (UniqueName: \"kubernetes.io/projected/4fb9ef38-4036-4c8b-8ba6-a5535f39d12c-kube-api-access-6thkn\") pod \"machine-config-operator-74547568cd-j597g\" (UID: \"4fb9ef38-4036-4c8b-8ba6-a5535f39d12c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j597g" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.479032 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt5hj\" (UniqueName: \"kubernetes.io/projected/ccbcca82-d1ed-481a-806f-cb338b5260da-kube-api-access-pt5hj\") pod \"authentication-operator-69f744f599-fklv5\" (UID: \"ccbcca82-d1ed-481a-806f-cb338b5260da\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fklv5" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.498699 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spswr\" (UniqueName: \"kubernetes.io/projected/4cb1b32e-b43a-4261-b515-3a2c08feb70b-kube-api-access-spswr\") pod \"machine-config-controller-84d6567774-4gg8d\" (UID: \"4cb1b32e-b43a-4261-b515-3a2c08feb70b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gg8d" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.514676 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:41 crc kubenswrapper[4909]: E1201 10:33:41.514857 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:42.01482401 +0000 UTC m=+139.249294918 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.515088 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:41 crc kubenswrapper[4909]: E1201 10:33:41.515520 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:42.015502535 +0000 UTC m=+139.249973433 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.518636 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fndh\" (UniqueName: \"kubernetes.io/projected/15547104-3163-44a2-9b36-4f4d0f3cde37-kube-api-access-9fndh\") pod \"router-default-5444994796-2q24w\" (UID: \"15547104-3163-44a2-9b36-4f4d0f3cde37\") " pod="openshift-ingress/router-default-5444994796-2q24w" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.532259 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh756\" (UniqueName: \"kubernetes.io/projected/45af2286-dfcc-44d1-bfcc-66cffbc195f0-kube-api-access-gh756\") pod \"console-operator-58897d9998-2pjzb\" (UID: \"45af2286-dfcc-44d1-bfcc-66cffbc195f0\") " pod="openshift-console-operator/console-operator-58897d9998-2pjzb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.552794 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75033a4c-93b8-44c8-9456-437afb1b80ce-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nm8z2\" (UID: \"75033a4c-93b8-44c8-9456-437afb1b80ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nm8z2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.560039 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5wkg2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.570190 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fklv5" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.581645 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85wd6\" (UniqueName: \"kubernetes.io/projected/a366b491-4c3c-40a9-86a0-a82d686b1a15-kube-api-access-85wd6\") pod \"console-f9d7485db-wfgm2\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.596630 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvnd2\" (UniqueName: \"kubernetes.io/projected/ddb74535-f2d6-4818-a4c9-4876876d70cf-kube-api-access-rvnd2\") pod \"dns-operator-744455d44c-h7pgb\" (UID: \"ddb74535-f2d6-4818-a4c9-4876876d70cf\") " pod="openshift-dns-operator/dns-operator-744455d44c-h7pgb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.601996 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-h7pgb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.618160 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.618789 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:41 crc kubenswrapper[4909]: E1201 10:33:41.618999 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:42.118955861 +0000 UTC m=+139.353426759 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.619489 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:41 crc kubenswrapper[4909]: E1201 10:33:41.619936 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:42.119917676 +0000 UTC m=+139.354388574 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.654522 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn7p9\" (UniqueName: \"kubernetes.io/projected/cae2ea64-3e52-4152-b919-c40d0d273f39-kube-api-access-cn7p9\") pod \"service-ca-operator-777779d784-cqnc4\" (UID: \"cae2ea64-3e52-4152-b919-c40d0d273f39\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cqnc4" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.672984 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-2q24w" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.674302 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g7qpw" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.681046 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hkvz\" (UniqueName: \"kubernetes.io/projected/d6190ef7-3deb-4bd9-9c73-109572e871d1-kube-api-access-8hkvz\") pod \"collect-profiles-29409750-j469h\" (UID: \"d6190ef7-3deb-4bd9-9c73-109572e871d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.684134 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7szjx\" (UniqueName: \"kubernetes.io/projected/ddaf67bb-df33-49ef-b65e-e77eb630f5e5-kube-api-access-7szjx\") pod \"multus-admission-controller-857f4d67dd-w927p\" (UID: \"ddaf67bb-df33-49ef-b65e-e77eb630f5e5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-w927p" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.689280 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ck6xv" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.705991 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bjjb\" (UniqueName: \"kubernetes.io/projected/da6c8746-9c5d-4990-b66f-cae50ffbc83f-kube-api-access-8bjjb\") pod \"migrator-59844c95c7-455s8\" (UID: \"da6c8746-9c5d-4990-b66f-cae50ffbc83f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-455s8" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.728029 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.731297 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g6c9\" (UniqueName: \"kubernetes.io/projected/8927315f-7e49-4493-a6e2-34da6d18167f-kube-api-access-9g6c9\") pod \"packageserver-d55dfcdfc-8sbn7\" (UID: \"8927315f-7e49-4493-a6e2-34da6d18167f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.740181 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq6cw\" (UniqueName: \"kubernetes.io/projected/fc72371b-fc8b-4050-afc6-deca5f398d3d-kube-api-access-vq6cw\") pod \"dns-default-k5k4b\" (UID: \"fc72371b-fc8b-4050-afc6-deca5f398d3d\") " pod="openshift-dns/dns-default-k5k4b" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.743296 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j597g" Dec 01 10:33:41 crc kubenswrapper[4909]: E1201 10:33:41.743632 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:42.243593709 +0000 UTC m=+139.478064607 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.744317 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gg8d" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.744528 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:41 crc kubenswrapper[4909]: E1201 10:33:41.745268 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:42.245252358 +0000 UTC m=+139.479723256 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.752401 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nm8z2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.776326 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2pjzb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.777756 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bk5h\" (UniqueName: \"kubernetes.io/projected/06b88e94-fb03-42ce-8ad6-e2e1dfd2fde0-kube-api-access-8bk5h\") pod \"service-ca-9c57cc56f-zmhkr\" (UID: \"06b88e94-fb03-42ce-8ad6-e2e1dfd2fde0\") " pod="openshift-service-ca/service-ca-9c57cc56f-zmhkr" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.779292 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25zlp\" (UniqueName: \"kubernetes.io/projected/ac0e24f9-2a35-4c96-b694-472eab5c4f15-kube-api-access-25zlp\") pod \"marketplace-operator-79b997595-7cd6g\" (UID: \"ac0e24f9-2a35-4c96-b694-472eab5c4f15\") " pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.780640 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-455s8" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.795206 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-w927p" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.797705 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.807842 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6mkb\" (UniqueName: \"kubernetes.io/projected/3df5462f-4575-4957-8e0a-8bdb27aeebca-kube-api-access-h6mkb\") pod \"package-server-manager-789f6589d5-67lbv\" (UID: \"3df5462f-4575-4957-8e0a-8bdb27aeebca\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-67lbv" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.815398 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb45j\" (UniqueName: \"kubernetes.io/projected/f992ea3b-b39a-48c5-84d5-67a8b6efcbd1-kube-api-access-sb45j\") pod \"olm-operator-6b444d44fb-shvkd\" (UID: \"f992ea3b-b39a-48c5-84d5-67a8b6efcbd1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-shvkd" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.827641 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-zmhkr" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.842457 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.845346 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:41 crc kubenswrapper[4909]: E1201 10:33:41.846206 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:42.346187884 +0000 UTC m=+139.580658772 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.847237 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wc9n\" (UniqueName: \"kubernetes.io/projected/9e922379-f723-4440-a90e-182b0917c969-kube-api-access-8wc9n\") pod \"control-plane-machine-set-operator-78cbb6b69f-gtxhz\" (UID: \"9e922379-f723-4440-a90e-182b0917c969\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gtxhz" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.859927 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.860032 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gtxhz" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.861059 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.864698 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6r4r\" (UniqueName: \"kubernetes.io/projected/4e293010-2e80-4435-a45f-d8011c7aa2fe-kube-api-access-q6r4r\") pod \"ingress-canary-tgkkj\" (UID: \"4e293010-2e80-4435-a45f-d8011c7aa2fe\") " pod="openshift-ingress-canary/ingress-canary-tgkkj" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.908959 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rdpv\" (UniqueName: \"kubernetes.io/projected/f8fe7db3-57ec-46e1-9cf8-ed1d429ec342-kube-api-access-4rdpv\") pod \"csi-hostpathplugin-dgdsb\" (UID: \"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342\") " pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.926492 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.954069 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.958731 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq8qq\" (UniqueName: \"kubernetes.io/projected/6d54983b-b2f2-4c7f-9bd8-27ebc8b4eca3-kube-api-access-vq8qq\") pod \"catalog-operator-68c6474976-mz2g7\" (UID: \"6d54983b-b2f2-4c7f-9bd8-27ebc8b4eca3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mz2g7" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.958913 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cqnc4" Dec 01 10:33:41 crc kubenswrapper[4909]: I1201 10:33:41.962738 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbpn9\" (UniqueName: \"kubernetes.io/projected/82e39ed8-5fa2-43f5-af61-a15fe62522c6-kube-api-access-kbpn9\") pod \"kube-storage-version-migrator-operator-b67b599dd-b56dj\" (UID: \"82e39ed8-5fa2-43f5-af61-a15fe62522c6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b56dj" Dec 01 10:33:41 crc kubenswrapper[4909]: E1201 10:33:41.972673 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-01 10:33:42.472644027 +0000 UTC m=+139.707114925 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.006626 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlgtz\" (UniqueName: \"kubernetes.io/projected/e86961fb-83c5-4428-a0d8-f864f75fb581-kube-api-access-tlgtz\") pod \"machine-config-server-dkdgx\" (UID: \"e86961fb-83c5-4428-a0d8-f864f75fb581\") " pod="openshift-machine-config-operator/machine-config-server-dkdgx" Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.018282 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dkdgx" Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.037557 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-k5k4b" Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.037848 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-shvkd" Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.045215 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tgkkj" Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.055698 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-67lbv" Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.061084 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:42 crc kubenswrapper[4909]: E1201 10:33:42.061678 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:42.561661235 +0000 UTC m=+139.796132133 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.070408 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b56dj" Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.124970 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mz2g7" Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.176117 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:42 crc kubenswrapper[4909]: E1201 10:33:42.176546 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:42.676527491 +0000 UTC m=+139.910998389 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.176681 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf" event={"ID":"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8","Type":"ContainerStarted","Data":"c9b659489dc8280cb0a167868b4067c8ba37afe2d1776f8e45d21527675af945"} Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.177598 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf" Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 
10:33:42.230975 4909 generic.go:334] "Generic (PLEG): container finished" podID="0491dd66-90bc-45d1-a72d-79dbe3f5711e" containerID="c85c6dfad7e2aaebe6a60f3791fc9bf3f96b5447465bfd07de0c9bdb40036b46" exitCode=0 Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.231416 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" event={"ID":"0491dd66-90bc-45d1-a72d-79dbe3f5711e","Type":"ContainerDied","Data":"c85c6dfad7e2aaebe6a60f3791fc9bf3f96b5447465bfd07de0c9bdb40036b46"} Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.259743 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf" Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.292127 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-2q24w" event={"ID":"15547104-3163-44a2-9b36-4f4d0f3cde37","Type":"ContainerStarted","Data":"33fa37e876f6a95a9ddfab2543bc90db4a71aed29243e388d985608b986ab0fd"} Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.292724 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:42 crc kubenswrapper[4909]: E1201 10:33:42.293383 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:42.793357268 +0000 UTC m=+140.027828176 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.293704 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:42 crc kubenswrapper[4909]: E1201 10:33:42.307981 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:42.807918111 +0000 UTC m=+140.042389009 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.380465 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" event={"ID":"be292a43-f2dc-44e5-8d2f-5b540b79ff6a","Type":"ContainerStarted","Data":"ad8b9c8852b3410ee446992e42ce412a9dea8015024e0176175e2a742d443d40"} Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.380528 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" event={"ID":"be292a43-f2dc-44e5-8d2f-5b540b79ff6a","Type":"ContainerStarted","Data":"428fbe4e4f0563cf1f36948b6b4a0eb92c866c0f294c2349fd7c8e65dd00c525"} Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.381445 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.389695 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5wkg2"] Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.397335 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:42 crc kubenswrapper[4909]: E1201 10:33:42.404863 4909 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:42.904792541 +0000 UTC m=+140.139263469 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.416273 4909 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-hpz48 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" start-of-body= Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.416352 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" podUID="be292a43-f2dc-44e5-8d2f-5b540b79ff6a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.421242 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb" event={"ID":"f8782986-5304-487f-962e-5b2e9233ab75","Type":"ContainerStarted","Data":"35115323614a45c76dd80f9a42b94f05f05b1a84cbb75c1b34f345e5c4976b10"} Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.422638 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb" Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.443583 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-msh6q" event={"ID":"ce512d0d-7510-49cf-af98-c09f19031ab1","Type":"ContainerStarted","Data":"422eead14beb65b39de1e9422bb7b8229b3fd5b9791615f172056e4eeb2a1f16"} Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.461164 4909 generic.go:334] "Generic (PLEG): container finished" podID="18a8084c-5f3b-469c-8dfc-35f67bd9a0c4" containerID="80ec752b386adf86103b9160dc83fef334745527e81cddac4e9ab27b7117c6ac" exitCode=0 Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.461285 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" event={"ID":"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4","Type":"ContainerDied","Data":"80ec752b386adf86103b9160dc83fef334745527e81cddac4e9ab27b7117c6ac"} Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.500497 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xmccs" event={"ID":"9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538","Type":"ContainerStarted","Data":"9c657d97d43f7755cc29bb77300f66d98418fa4a27621293dd214c8c70deed9d"} Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.500595 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xmccs" event={"ID":"9028fad8-d0dc-4d8b-bbe0-1b7a9cd1f538","Type":"ContainerStarted","Data":"07a584dbdc0bb0ec0d6ab356fa2ce7b30faf45d785d138f8cd26dd3022bf7404"} Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.500666 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb" Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.501164 4909 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-2n2sj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.501214 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2n2sj" podUID="987cc233-91d0-4ed1-8d93-62e90e5e0925" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.544248 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:42 crc kubenswrapper[4909]: E1201 10:33:42.549751 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:43.049728138 +0000 UTC m=+140.284199056 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.555630 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fklv5"] Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.684012 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:42 crc kubenswrapper[4909]: E1201 10:33:42.685894 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:43.185848878 +0000 UTC m=+140.420319836 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.790452 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:42 crc kubenswrapper[4909]: E1201 10:33:42.790786 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:43.290774188 +0000 UTC m=+140.525245086 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.815673 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d9vlp" podStartSLOduration=120.815652261 podStartE2EDuration="2m0.815652261s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:42.770657525 +0000 UTC m=+140.005128433" watchObservedRunningTime="2025-12-01 10:33:42.815652261 +0000 UTC m=+140.050123159" Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.850986 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2m6qp" podStartSLOduration=120.85096602 podStartE2EDuration="2m0.85096602s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:42.817524528 +0000 UTC m=+140.051995426" watchObservedRunningTime="2025-12-01 10:33:42.85096602 +0000 UTC m=+140.085436918" Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.852134 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-644kz" podStartSLOduration=120.852128082 podStartE2EDuration="2m0.852128082s" 
podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:42.849398344 +0000 UTC m=+140.083869262" watchObservedRunningTime="2025-12-01 10:33:42.852128082 +0000 UTC m=+140.086598980" Dec 01 10:33:42 crc kubenswrapper[4909]: I1201 10:33:42.891660 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:42 crc kubenswrapper[4909]: E1201 10:33:42.891992 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:43.391973842 +0000 UTC m=+140.626444740 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:42.993822 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:43 crc kubenswrapper[4909]: E1201 10:33:42.994199 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:43.494186165 +0000 UTC m=+140.728657063 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:42.999865 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-2n2sj" podStartSLOduration=120.999847938 podStartE2EDuration="2m0.999847938s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:42.998630094 +0000 UTC m=+140.233100992" watchObservedRunningTime="2025-12-01 10:33:42.999847938 +0000 UTC m=+140.234318836" Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.094943 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:43 crc kubenswrapper[4909]: E1201 10:33:43.095460 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:43.595439442 +0000 UTC m=+140.829910340 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.118148 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-msh6q" podStartSLOduration=121.118123667 podStartE2EDuration="2m1.118123667s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:43.034251134 +0000 UTC m=+140.268722032" watchObservedRunningTime="2025-12-01 10:33:43.118123667 +0000 UTC m=+140.352594565" Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.134488 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-ck6xv"] Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.134546 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-vh74j"] Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.139181 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7"] Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.162961 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xmccs" podStartSLOduration=121.162947918 podStartE2EDuration="2m1.162947918s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:43.161888829 +0000 UTC m=+140.396359727" watchObservedRunningTime="2025-12-01 10:33:43.162947918 +0000 UTC m=+140.397418816" Dec 01 10:33:43 crc kubenswrapper[4909]: W1201 10:33:43.185683 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9fc5e65_280d_4313_99c8_6f32595347e9.slice/crio-70c425dbc2593b4591c273830691b6c5844bbe147ee9c54c9911deb64b416360 WatchSource:0}: Error finding container 70c425dbc2593b4591c273830691b6c5844bbe147ee9c54c9911deb64b416360: Status 404 returned error can't find the container with id 70c425dbc2593b4591c273830691b6c5844bbe147ee9c54c9911deb64b416360 Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.205603 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:43 crc kubenswrapper[4909]: E1201 10:33:43.205978 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:43.705966802 +0000 UTC m=+140.940437700 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.239762 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h7pgb"] Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.309442 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:43 crc kubenswrapper[4909]: E1201 10:33:43.309830 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:43.809807903 +0000 UTC m=+141.044278801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.323137 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" podStartSLOduration=121.32311321 podStartE2EDuration="2m1.32311321s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:43.315184736 +0000 UTC m=+140.549655634" watchObservedRunningTime="2025-12-01 10:33:43.32311321 +0000 UTC m=+140.557584108" Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.339160 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g7qpw"] Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.413588 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:43 crc kubenswrapper[4909]: E1201 10:33:43.414073 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-01 10:33:43.914058538 +0000 UTC m=+141.148529436 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.422171 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w8r4s" podStartSLOduration=121.422133248 podStartE2EDuration="2m1.422133248s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:43.411548158 +0000 UTC m=+140.646019056" watchObservedRunningTime="2025-12-01 10:33:43.422133248 +0000 UTC m=+140.656604146" Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.483928 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-j597g"] Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.498010 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf" podStartSLOduration=120.497986983 podStartE2EDuration="2m0.497986983s" podCreationTimestamp="2025-12-01 10:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:43.462766558 +0000 UTC m=+140.697237456" watchObservedRunningTime="2025-12-01 10:33:43.497986983 +0000 UTC m=+140.732457881" Dec 01 
10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.515041 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:43 crc kubenswrapper[4909]: E1201 10:33:43.519735 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:44.019713873 +0000 UTC m=+141.254184771 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.526195 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dkdgx" event={"ID":"e86961fb-83c5-4428-a0d8-f864f75fb581","Type":"ContainerStarted","Data":"c7145c375d9c60915b855173446de96cfdb7413a24ce437040fef64b6712910c"} Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.526262 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dkdgx" event={"ID":"e86961fb-83c5-4428-a0d8-f864f75fb581","Type":"ContainerStarted","Data":"58842db9c04f9c4fbcbf51b607b2f9310af174abcc97a633479d3b45ee89fd90"} Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.530519 4909 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2pjzb"] Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.535412 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h"] Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.537850 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ck6xv" event={"ID":"5041fa62-cf08-4d18-8226-c8dd04437e14","Type":"ContainerStarted","Data":"8cf4acaaf61f548dd9008701191b2931e3bd38131c4244177951bd85056c3493"} Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.550821 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-w927p"] Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.552585 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" event={"ID":"a9fc5e65-280d-4313-99c8-6f32595347e9","Type":"ContainerStarted","Data":"70c425dbc2593b4591c273830691b6c5844bbe147ee9c54c9911deb64b416360"} Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.563747 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4gg8d"] Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.563851 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7" event={"ID":"8927315f-7e49-4493-a6e2-34da6d18167f","Type":"ContainerStarted","Data":"d2b73c069c6e44dae122a1be91d63266147e46988aa403646e5147f065309a3f"} Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.611279 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fklv5" 
event={"ID":"ccbcca82-d1ed-481a-806f-cb338b5260da","Type":"ContainerStarted","Data":"53520292673bb730e0d108df505bdd1f11ae06b833da6e82f21e47849f1dbd96"} Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.611363 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fklv5" event={"ID":"ccbcca82-d1ed-481a-806f-cb338b5260da","Type":"ContainerStarted","Data":"56efb69b5548318e94ce5af50a9e394a5093f6d093c08f4091849f9154ff866b"} Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.621251 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:43 crc kubenswrapper[4909]: E1201 10:33:43.621597 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:44.121584453 +0000 UTC m=+141.356055351 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.622118 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h7pgb" event={"ID":"ddb74535-f2d6-4818-a4c9-4876876d70cf","Type":"ContainerStarted","Data":"6098263b42a4f0464e93674d05347dd47901aef6a988e0fa818e8c625e3ec618"} Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.625887 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5wkg2" event={"ID":"6744f6c5-4a6c-4188-b4c4-0a25aac51b1d","Type":"ContainerStarted","Data":"e6e11d5da5670ab84e72469d997b8a1f38db5e682a8c50cabd81ec3dd5065478"} Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.625941 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5wkg2" event={"ID":"6744f6c5-4a6c-4188-b4c4-0a25aac51b1d","Type":"ContainerStarted","Data":"53398c578d00a59c00ee65b832260f7daea6a77531910c4e49806ac02e0a841a"} Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.636997 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-2q24w" event={"ID":"15547104-3163-44a2-9b36-4f4d0f3cde37","Type":"ContainerStarted","Data":"069cab4da129a27825fdad773cd65114f9943d4d0b9b863dabf2cf664d24d5f5"} Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.645474 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:33:43 
crc kubenswrapper[4909]: I1201 10:33:43.665026 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb" podStartSLOduration=121.664991972 podStartE2EDuration="2m1.664991972s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:43.658134176 +0000 UTC m=+140.892605084" watchObservedRunningTime="2025-12-01 10:33:43.664991972 +0000 UTC m=+140.899462880" Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.674940 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-2q24w" Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.725548 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:43 crc kubenswrapper[4909]: E1201 10:33:43.729598 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:44.229573832 +0000 UTC m=+141.464044730 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.829051 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:43 crc kubenswrapper[4909]: E1201 10:33:43.835340 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:44.335288449 +0000 UTC m=+141.569759347 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.853830 4909 patch_prober.go:28] interesting pod/router-default-5444994796-2q24w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:43 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Dec 01 10:33:43 crc kubenswrapper[4909]: [+]process-running ok Dec 01 10:33:43 crc kubenswrapper[4909]: healthz check failed Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.858914 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2q24w" podUID="15547104-3163-44a2-9b36-4f4d0f3cde37" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:43 crc kubenswrapper[4909]: I1201 10:33:43.940803 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:43 crc kubenswrapper[4909]: E1201 10:33:43.943818 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 10:33:44.443791038 +0000 UTC m=+141.678261936 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.017596 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hw7wm" podStartSLOduration=123.017573948 podStartE2EDuration="2m3.017573948s" podCreationTimestamp="2025-12-01 10:31:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:43.932108618 +0000 UTC m=+141.166579536" watchObservedRunningTime="2025-12-01 10:33:44.017573948 +0000 UTC m=+141.252044866" Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.030812 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-455s8"] Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.046676 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:44 crc kubenswrapper[4909]: E1201 10:33:44.047348 4909 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:44.547328707 +0000 UTC m=+141.781799605 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.065724 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dgdsb"] Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.150221 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:44 crc kubenswrapper[4909]: E1201 10:33:44.150623 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:44.650606837 +0000 UTC m=+141.885077735 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.161560 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:44 crc kubenswrapper[4909]: E1201 10:33:44.161997 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:44.661982466 +0000 UTC m=+141.896453354 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.196605 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-fklv5" podStartSLOduration=122.196583778 podStartE2EDuration="2m2.196583778s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:44.194799955 +0000 UTC m=+141.429270863" watchObservedRunningTime="2025-12-01 10:33:44.196583778 +0000 UTC m=+141.431054676" Dec 01 10:33:44 crc kubenswrapper[4909]: W1201 10:33:44.249301 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8fe7db3_57ec_46e1_9cf8_ed1d429ec342.slice/crio-7da83e7f38b5c0a36c1d284395de0ca820d5063c28cc8ffbef4d46c59349757e WatchSource:0}: Error finding container 7da83e7f38b5c0a36c1d284395de0ca820d5063c28cc8ffbef4d46c59349757e: Status 404 returned error can't find the container with id 7da83e7f38b5c0a36c1d284395de0ca820d5063c28cc8ffbef4d46c59349757e Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.266445 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-dkdgx" podStartSLOduration=6.266426657 podStartE2EDuration="6.266426657s" podCreationTimestamp="2025-12-01 10:33:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:44.256419908 +0000 UTC m=+141.490890816" watchObservedRunningTime="2025-12-01 10:33:44.266426657 +0000 UTC m=+141.500897555" Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.267632 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:44 crc kubenswrapper[4909]: E1201 10:33:44.268349 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:44.768312846 +0000 UTC m=+142.002783744 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.363546 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gtxhz"] Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.374380 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:44 crc kubenswrapper[4909]: E1201 10:33:44.374723 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:44.874711228 +0000 UTC m=+142.109182126 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.399963 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zmhkr"] Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.408733 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wfgm2"] Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.409570 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7cd6g"] Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.411698 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nm8z2"] Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.484418 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:44 crc kubenswrapper[4909]: E1201 10:33:44.484830 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:44.984813762 +0000 UTC m=+142.219284660 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.588329 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:44 crc kubenswrapper[4909]: E1201 10:33:44.588736 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:45.088722435 +0000 UTC m=+142.323193333 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.589373 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-2q24w" podStartSLOduration=122.589358488 podStartE2EDuration="2m2.589358488s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:44.569808626 +0000 UTC m=+141.804279524" watchObservedRunningTime="2025-12-01 10:33:44.589358488 +0000 UTC m=+141.823829386" Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.608832 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-shvkd"] Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.678494 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-67lbv"] Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.689937 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:44 crc kubenswrapper[4909]: E1201 10:33:44.690307 4909 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:45.190288634 +0000 UTC m=+142.424759542 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.699043 4909 patch_prober.go:28] interesting pod/router-default-5444994796-2q24w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:44 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Dec 01 10:33:44 crc kubenswrapper[4909]: [+]process-running ok Dec 01 10:33:44 crc kubenswrapper[4909]: healthz check failed Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.699092 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2q24w" podUID="15547104-3163-44a2-9b36-4f4d0f3cde37" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.762659 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b56dj"] Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.771809 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-455s8" 
event={"ID":"da6c8746-9c5d-4990-b66f-cae50ffbc83f","Type":"ContainerStarted","Data":"c0e73d6f6b436c161f4beb7f1c211e16a8ebf154f1f2610f8b3559ef77011438"} Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.793095 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:44 crc kubenswrapper[4909]: E1201 10:33:44.793482 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:45.29346112 +0000 UTC m=+142.527932018 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.851581 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h7pgb" event={"ID":"ddb74535-f2d6-4818-a4c9-4876876d70cf","Type":"ContainerStarted","Data":"c94ce93876214344f64edf9925e3eedee16868129486695006f2b561a1ec5b29"} Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.880968 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-k5k4b"] Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.882752 4909 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cqnc4"] Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.891739 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tgkkj"] Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.895940 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mz2g7"] Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.896525 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:44 crc kubenswrapper[4909]: E1201 10:33:44.896958 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:45.396867605 +0000 UTC m=+142.631338503 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.916032 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g7qpw" event={"ID":"4c58fab6-7c2c-449d-9b3c-4f8414397cea","Type":"ContainerStarted","Data":"6cee65c4c29a40f25dd27cea9681016ac696872d85a2083a1de99d0a5fa23ea3"} Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.942709 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ck6xv" event={"ID":"5041fa62-cf08-4d18-8226-c8dd04437e14","Type":"ContainerStarted","Data":"0ff3205d4a6ee693cd1228c86832e4a44287133730fc8c0d5303089347aec138"} Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.962401 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2pjzb" event={"ID":"45af2286-dfcc-44d1-bfcc-66cffbc195f0","Type":"ContainerStarted","Data":"f48c6be8071e176c113e05d870b32c95484c38f50e4259b8cefeae3602c29da8"} Dec 01 10:33:44 crc kubenswrapper[4909]: I1201 10:33:44.982684 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5wkg2" event={"ID":"6744f6c5-4a6c-4188-b4c4-0a25aac51b1d","Type":"ContainerStarted","Data":"1cc69ed58de26b960ab287860da6b3433ff750e3eb42fbaf394625d2a03b7493"} Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.001444 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:45 crc kubenswrapper[4909]: E1201 10:33:45.002109 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:45.502059054 +0000 UTC m=+142.736529942 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.005856 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" event={"ID":"a9fc5e65-280d-4313-99c8-6f32595347e9","Type":"ContainerStarted","Data":"eb72855c1b2ce08a23dd53b956be73020ceb02c40eaaf6426548a5950562174c"} Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.015802 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-5wkg2" podStartSLOduration=122.015783647 podStartE2EDuration="2m2.015783647s" podCreationTimestamp="2025-12-01 10:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:45.014527022 +0000 UTC m=+142.248997920" 
watchObservedRunningTime="2025-12-01 10:33:45.015783647 +0000 UTC m=+142.250254535" Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.057935 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-vh74j" podStartSLOduration=123.057908511 podStartE2EDuration="2m3.057908511s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:45.057465625 +0000 UTC m=+142.291936543" watchObservedRunningTime="2025-12-01 10:33:45.057908511 +0000 UTC m=+142.292379429" Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.069626 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7" event={"ID":"8927315f-7e49-4493-a6e2-34da6d18167f","Type":"ContainerStarted","Data":"7eeae12187d8fe234e1039b62064454300e7605fa50d142d99e743351da29da1"} Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.070171 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7" Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.098088 4909 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-8sbn7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" start-of-body= Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.098175 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7" podUID="8927315f-7e49-4493-a6e2-34da6d18167f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" Dec 01 10:33:45 crc 
kubenswrapper[4909]: I1201 10:33:45.101046 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j597g" event={"ID":"4fb9ef38-4036-4c8b-8ba6-a5535f39d12c","Type":"ContainerStarted","Data":"cddbda31ff831e6324b02fedac5263382a22fdb5aad31e0830c9202b296ee3ed"} Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.102332 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:45 crc kubenswrapper[4909]: E1201 10:33:45.104240 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:45.604215284 +0000 UTC m=+142.838686182 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.111101 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" event={"ID":"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342","Type":"ContainerStarted","Data":"7da83e7f38b5c0a36c1d284395de0ca820d5063c28cc8ffbef4d46c59349757e"} Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.119835 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7" podStartSLOduration=122.118901852 podStartE2EDuration="2m2.118901852s" podCreationTimestamp="2025-12-01 10:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:45.108680905 +0000 UTC m=+142.343151803" watchObservedRunningTime="2025-12-01 10:33:45.118901852 +0000 UTC m=+142.353372740" Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.126486 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" event={"ID":"0491dd66-90bc-45d1-a72d-79dbe3f5711e","Type":"ContainerStarted","Data":"f2d1c7cd50f93db996e4c72c1b4288173d8689ed0cdf4331901b696882870720"} Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.162349 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" 
event={"ID":"18a8084c-5f3b-469c-8dfc-35f67bd9a0c4","Type":"ContainerStarted","Data":"2d71ba9eec538a0a10286b331fd02b9e654483486e8cd5a630118ad1f0bdee91"} Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.170616 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h" event={"ID":"d6190ef7-3deb-4bd9-9c73-109572e871d1","Type":"ContainerStarted","Data":"e2c03736b389ef96455059846303653cd2c26d7c9e2fbcf0d6d9f44796d90f37"} Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.177814 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-w927p" event={"ID":"ddaf67bb-df33-49ef-b65e-e77eb630f5e5","Type":"ContainerStarted","Data":"9478385caf2e87f2994af9859904d9edb511f5916be7d8c94b86110ca1d4465f"} Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.184258 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gg8d" event={"ID":"4cb1b32e-b43a-4261-b515-3a2c08feb70b","Type":"ContainerStarted","Data":"e07af2605ba1b92149b45c69ca0e1c9f1d451962b6357f709f152bdb579dfbc1"} Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.184335 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gg8d" event={"ID":"4cb1b32e-b43a-4261-b515-3a2c08feb70b","Type":"ContainerStarted","Data":"33392a606d93a090a31728adbeb7582128400e2d363cb4088066b1fc1b696dbb"} Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.202451 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" podStartSLOduration=122.202433192 podStartE2EDuration="2m2.202433192s" podCreationTimestamp="2025-12-01 10:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 
10:33:45.201514869 +0000 UTC m=+142.435985767" watchObservedRunningTime="2025-12-01 10:33:45.202433192 +0000 UTC m=+142.436904090" Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.205854 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:45 crc kubenswrapper[4909]: E1201 10:33:45.208623 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:45.708600674 +0000 UTC m=+142.943071782 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.306753 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:45 crc kubenswrapper[4909]: E1201 10:33:45.307042 4909 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:45.807003629 +0000 UTC m=+143.041474527 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.307339 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:45 crc kubenswrapper[4909]: E1201 10:33:45.309822 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:45.809799759 +0000 UTC m=+143.044270917 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.409144 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:45 crc kubenswrapper[4909]: E1201 10:33:45.409611 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:45.909593424 +0000 UTC m=+143.144064322 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.510755 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:45 crc kubenswrapper[4909]: E1201 10:33:45.511133 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:46.011116551 +0000 UTC m=+143.245587449 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.611790 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:45 crc kubenswrapper[4909]: E1201 10:33:45.612051 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:46.112017666 +0000 UTC m=+143.346488564 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.613365 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:45 crc kubenswrapper[4909]: E1201 10:33:45.613788 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:46.113769429 +0000 UTC m=+143.348240327 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.682848 4909 patch_prober.go:28] interesting pod/router-default-5444994796-2q24w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:45 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Dec 01 10:33:45 crc kubenswrapper[4909]: [+]process-running ok Dec 01 10:33:45 crc kubenswrapper[4909]: healthz check failed Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.682917 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2q24w" podUID="15547104-3163-44a2-9b36-4f4d0f3cde37" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.716937 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:45 crc kubenswrapper[4909]: E1201 10:33:45.717110 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 10:33:46.21708321 +0000 UTC m=+143.451554108 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.717231 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:45 crc kubenswrapper[4909]: E1201 10:33:45.717729 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:46.217722363 +0000 UTC m=+143.452193261 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.818831 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:45 crc kubenswrapper[4909]: E1201 10:33:45.819279 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:46.319261331 +0000 UTC m=+143.553732239 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:45 crc kubenswrapper[4909]: I1201 10:33:45.920530 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:45 crc kubenswrapper[4909]: E1201 10:33:45.921000 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:46.420981214 +0000 UTC m=+143.655452112 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.026610 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:46 crc kubenswrapper[4909]: E1201 10:33:46.026765 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:46.526737724 +0000 UTC m=+143.761208622 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.027306 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:46 crc kubenswrapper[4909]: E1201 10:33:46.027631 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:46.527618325 +0000 UTC m=+143.762089223 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.129469 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:46 crc kubenswrapper[4909]: E1201 10:33:46.129679 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:46.629660311 +0000 UTC m=+143.864131209 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.130033 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:46 crc kubenswrapper[4909]: E1201 10:33:46.130335 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:46.630327386 +0000 UTC m=+143.864798284 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.237764 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:46 crc kubenswrapper[4909]: E1201 10:33:46.238081 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:46.738065665 +0000 UTC m=+143.972536553 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.253637 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tgkkj" event={"ID":"4e293010-2e80-4435-a45f-d8011c7aa2fe","Type":"ContainerStarted","Data":"0c9ca6ae07c2b0d82546e6ef1b6f83ee21096a59645c6153e71fed6f506c69bc"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.253689 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tgkkj" event={"ID":"4e293010-2e80-4435-a45f-d8011c7aa2fe","Type":"ContainerStarted","Data":"9f0a7852e54e675480510264a5c149ab052f7e385002d29dd9557a3a8dfaacb3"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.280251 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cqnc4" event={"ID":"cae2ea64-3e52-4152-b919-c40d0d273f39","Type":"ContainerStarted","Data":"4757b532575c9e6342e373cc887b4708d9776f3ff36db0dc47d90454e605fc8e"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.280313 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cqnc4" event={"ID":"cae2ea64-3e52-4152-b919-c40d0d273f39","Type":"ContainerStarted","Data":"4ed36e7d69dfb961137c5f74bebe7a225b1f80da2f7572c1494b99961968a305"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.286837 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-tgkkj" podStartSLOduration=8.286812197 
podStartE2EDuration="8.286812197s" podCreationTimestamp="2025-12-01 10:33:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:46.283917103 +0000 UTC m=+143.518388001" watchObservedRunningTime="2025-12-01 10:33:46.286812197 +0000 UTC m=+143.521283115" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.307033 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gtxhz" event={"ID":"9e922379-f723-4440-a90e-182b0917c969","Type":"ContainerStarted","Data":"8a39a64872d8de9dfe511f01df2151ec952a147b9b7f31bc4d53bd85e45b58ac"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.307079 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gtxhz" event={"ID":"9e922379-f723-4440-a90e-182b0917c969","Type":"ContainerStarted","Data":"f10217c717bdabb4dcfdea3615c955ab8ea200a014e2fb258763bccdcf5edf17"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.334684 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-zmhkr" event={"ID":"06b88e94-fb03-42ce-8ad6-e2e1dfd2fde0","Type":"ContainerStarted","Data":"f292e89bff1adfbb8baf7a5e32683d03cd4587f8ad1412c3a65704fa25f72de9"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.334727 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-zmhkr" event={"ID":"06b88e94-fb03-42ce-8ad6-e2e1dfd2fde0","Type":"ContainerStarted","Data":"100ceb0218c9b51d0d8a0cd1702e0781b1b73decdc4c28828a99599e5687034a"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.340206 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:46 crc kubenswrapper[4909]: E1201 10:33:46.341677 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:46.841663357 +0000 UTC m=+144.076134255 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.347127 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gtxhz" podStartSLOduration=123.347113133 podStartE2EDuration="2m3.347113133s" podCreationTimestamp="2025-12-01 10:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:46.34536791 +0000 UTC m=+143.579838808" watchObservedRunningTime="2025-12-01 10:33:46.347113133 +0000 UTC m=+143.581584031" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.348130 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cqnc4" podStartSLOduration=123.348124249 podStartE2EDuration="2m3.348124249s" podCreationTimestamp="2025-12-01 10:31:43 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:46.311300476 +0000 UTC m=+143.545771374" watchObservedRunningTime="2025-12-01 10:33:46.348124249 +0000 UTC m=+143.582595147" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.348599 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mz2g7" event={"ID":"6d54983b-b2f2-4c7f-9bd8-27ebc8b4eca3","Type":"ContainerStarted","Data":"07541c98769a390ef5c93e209685eeca363748adc30439b32f83535c3e13ffdb"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.348657 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mz2g7" event={"ID":"6d54983b-b2f2-4c7f-9bd8-27ebc8b4eca3","Type":"ContainerStarted","Data":"1c20fde81a64c4d8be758adba3bfc8f176fb2f6f6b0cd168c4c4d9733491e93d"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.349645 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mz2g7" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.364724 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gg8d" event={"ID":"4cb1b32e-b43a-4261-b515-3a2c08feb70b","Type":"ContainerStarted","Data":"de9e28ffc5eae02ce1c4225407793b64f08be40bcc2eb746077a8250b051f3ad"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.373002 4909 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-mz2g7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.373069 4909 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mz2g7" podUID="6d54983b-b2f2-4c7f-9bd8-27ebc8b4eca3" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.382049 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-zmhkr" podStartSLOduration=123.382035418 podStartE2EDuration="2m3.382035418s" podCreationTimestamp="2025-12-01 10:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:46.380345626 +0000 UTC m=+143.614816524" watchObservedRunningTime="2025-12-01 10:33:46.382035418 +0000 UTC m=+143.616506316" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.416345 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wfgm2" event={"ID":"a366b491-4c3c-40a9-86a0-a82d686b1a15","Type":"ContainerStarted","Data":"2dcc3ccc9d7778a7c6fa74d7b692a89a00873b2026aa0daaa5ef7f4d1d939d8c"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.416799 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wfgm2" event={"ID":"a366b491-4c3c-40a9-86a0-a82d686b1a15","Type":"ContainerStarted","Data":"9bb9d4c1d5385aed3314778bb88d9cb86cea8ef02c20db0d8a114bfdac19d090"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.419003 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gg8d" podStartSLOduration=123.418985565 podStartE2EDuration="2m3.418985565s" podCreationTimestamp="2025-12-01 10:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-01 10:33:46.41606653 +0000 UTC m=+143.650537428" watchObservedRunningTime="2025-12-01 10:33:46.418985565 +0000 UTC m=+143.653456473" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.428603 4909 generic.go:334] "Generic (PLEG): container finished" podID="5041fa62-cf08-4d18-8226-c8dd04437e14" containerID="0ff3205d4a6ee693cd1228c86832e4a44287133730fc8c0d5303089347aec138" exitCode=0 Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.428702 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ck6xv" event={"ID":"5041fa62-cf08-4d18-8226-c8dd04437e14","Type":"ContainerDied","Data":"0ff3205d4a6ee693cd1228c86832e4a44287133730fc8c0d5303089347aec138"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.428735 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ck6xv" event={"ID":"5041fa62-cf08-4d18-8226-c8dd04437e14","Type":"ContainerStarted","Data":"dd423e4b94a0030db5f9d81fda80e985af7349ea29cafe02c1b09ce1fc87f9f1"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.429381 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ck6xv" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.441554 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:46 crc kubenswrapper[4909]: E1201 10:33:46.442362 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 10:33:46.942344444 +0000 UTC m=+144.176815342 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.445687 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mz2g7" podStartSLOduration=123.445666743 podStartE2EDuration="2m3.445666743s" podCreationTimestamp="2025-12-01 10:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:46.442550441 +0000 UTC m=+143.677021339" watchObservedRunningTime="2025-12-01 10:33:46.445666743 +0000 UTC m=+143.680137641" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.458054 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h" event={"ID":"d6190ef7-3deb-4bd9-9c73-109572e871d1","Type":"ContainerStarted","Data":"a71c7aa8238a55a6461527a2dd182f2c7475657eb0842df727c47fdf1e8313a8"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.474709 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ck6xv" podStartSLOduration=124.474690836 podStartE2EDuration="2m4.474690836s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 
10:33:46.472962924 +0000 UTC m=+143.707433822" watchObservedRunningTime="2025-12-01 10:33:46.474690836 +0000 UTC m=+143.709161734" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.485211 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k5k4b" event={"ID":"fc72371b-fc8b-4050-afc6-deca5f398d3d","Type":"ContainerStarted","Data":"db8b2057b4d7bf60d45a7d5156da7b9d870b527f51ff833ed8e4cbe881803b74"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.485291 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k5k4b" event={"ID":"fc72371b-fc8b-4050-afc6-deca5f398d3d","Type":"ContainerStarted","Data":"c35f5cae4e348b06014ae923cc895aaef217c92703194a87af05e33a80f174f6"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.486165 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-k5k4b" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.512697 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-wfgm2" podStartSLOduration=124.512683091 podStartE2EDuration="2m4.512683091s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:46.5107078 +0000 UTC m=+143.745178708" watchObservedRunningTime="2025-12-01 10:33:46.512683091 +0000 UTC m=+143.747153979" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.546417 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:46 crc kubenswrapper[4909]: E1201 
10:33:46.547710 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:47.047696369 +0000 UTC m=+144.282167257 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.551860 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" event={"ID":"0491dd66-90bc-45d1-a72d-79dbe3f5711e","Type":"ContainerStarted","Data":"f2eb104b222228e580b9bc3a6d164a6cc2146ce4ebe254764d5b65caff0ac3ad"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.589317 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b56dj" event={"ID":"82e39ed8-5fa2-43f5-af61-a15fe62522c6","Type":"ContainerStarted","Data":"a99a8d810e0986d7b7b7e3e79afa1298536f9ae94e6bc92805394d3789e74d5b"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.589387 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b56dj" event={"ID":"82e39ed8-5fa2-43f5-af61-a15fe62522c6","Type":"ContainerStarted","Data":"db5d6206f3dc55fb51512b80c60a5f44d9b88493164a8e4ee8411f33d840df76"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.599862 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h" podStartSLOduration=124.599841142 podStartE2EDuration="2m4.599841142s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:46.580451205 +0000 UTC m=+143.814922123" watchObservedRunningTime="2025-12-01 10:33:46.599841142 +0000 UTC m=+143.834312040" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.619926 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2pjzb" event={"ID":"45af2286-dfcc-44d1-bfcc-66cffbc195f0","Type":"ContainerStarted","Data":"10fefdacc124079b57b3e1c58d1803dfd8cc7f2de30ddce34debdd5f4a1b150e"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.620944 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-2pjzb" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.628825 4909 patch_prober.go:28] interesting pod/console-operator-58897d9998-2pjzb container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.628894 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-2pjzb" podUID="45af2286-dfcc-44d1-bfcc-66cffbc195f0" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.647597 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:46 crc kubenswrapper[4909]: E1201 10:33:46.648658 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:47.148625385 +0000 UTC m=+144.383096343 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.668694 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j597g" event={"ID":"4fb9ef38-4036-4c8b-8ba6-a5535f39d12c","Type":"ContainerStarted","Data":"dbb8384c41a0103932c4ad1e525f98c8b35c215afe26b88b016f70e1443225d7"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.668742 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j597g" event={"ID":"4fb9ef38-4036-4c8b-8ba6-a5535f39d12c","Type":"ContainerStarted","Data":"dce437b533987933a8111b605a4bd9883191c4e92487eae68c18a64db35032d9"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.688080 4909 patch_prober.go:28] interesting pod/router-default-5444994796-2q24w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:46 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Dec 01 10:33:46 crc kubenswrapper[4909]: [+]process-running ok Dec 01 10:33:46 crc kubenswrapper[4909]: healthz check failed Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.688151 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2q24w" podUID="15547104-3163-44a2-9b36-4f4d0f3cde37" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.703838 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-k5k4b" podStartSLOduration=8.703822817 podStartE2EDuration="8.703822817s" podCreationTimestamp="2025-12-01 10:33:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:46.629456505 +0000 UTC m=+143.863927403" watchObservedRunningTime="2025-12-01 10:33:46.703822817 +0000 UTC m=+143.938293715" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.704388 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b56dj" podStartSLOduration=123.704380628 podStartE2EDuration="2m3.704380628s" podCreationTimestamp="2025-12-01 10:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:46.702309202 +0000 UTC m=+143.936780100" watchObservedRunningTime="2025-12-01 10:33:46.704380628 +0000 UTC m=+143.938851526" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.706744 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" 
event={"ID":"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342","Type":"ContainerStarted","Data":"4623c00c61159445eaddeaeb3226bb33f11bae8a5ace91b0ca63e2900cad7648"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.709053 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-w927p" event={"ID":"ddaf67bb-df33-49ef-b65e-e77eb630f5e5","Type":"ContainerStarted","Data":"fe0f3931a7c1d76c201b76819e970aa79e6e40ffebb7f764b149e83b190c4d6d"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.710770 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" event={"ID":"ac0e24f9-2a35-4c96-b694-472eab5c4f15","Type":"ContainerStarted","Data":"dc9e0b63004535ea8efd390db6ab59c3cd13b7dc2169145c58102485a011cd19"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.710800 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" event={"ID":"ac0e24f9-2a35-4c96-b694-472eab5c4f15","Type":"ContainerStarted","Data":"04510736f49555a97a838ac21ad54448dcaf67b28ad520bd569e391cf80f8c91"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.713926 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.725996 4909 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7cd6g container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.726053 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" podUID="ac0e24f9-2a35-4c96-b694-472eab5c4f15" containerName="marketplace-operator" 
probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.744721 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-455s8" event={"ID":"da6c8746-9c5d-4990-b66f-cae50ffbc83f","Type":"ContainerStarted","Data":"af834fe9ecf0b1bbbee3430f9f5247b5a238d6245251f5ec8098c4e7e05cb937"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.750465 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:46 crc kubenswrapper[4909]: E1201 10:33:46.755382 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:47.255356388 +0000 UTC m=+144.489827286 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.763278 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" podStartSLOduration=124.763260863 podStartE2EDuration="2m4.763260863s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:46.758710539 +0000 UTC m=+143.993181467" watchObservedRunningTime="2025-12-01 10:33:46.763260863 +0000 UTC m=+143.997731761" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.767505 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-67lbv" event={"ID":"3df5462f-4575-4957-8e0a-8bdb27aeebca","Type":"ContainerStarted","Data":"b1cc6cf25016b70b7b87f79cd5627e70937393528beeabdebe8d95d9477f6732"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.767545 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-67lbv" event={"ID":"3df5462f-4575-4957-8e0a-8bdb27aeebca","Type":"ContainerStarted","Data":"41240475c8d248a408f03c71889dee2a2ccdaa9dd3474c7162ab279c5142a8db"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.767555 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-67lbv" 
event={"ID":"3df5462f-4575-4957-8e0a-8bdb27aeebca","Type":"ContainerStarted","Data":"c158cdd642935336a3ad5b47883e4ab456f15878f5a5774dddd1d6756185291d"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.768000 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-67lbv" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.788720 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g7qpw" event={"ID":"4c58fab6-7c2c-449d-9b3c-4f8414397cea","Type":"ContainerStarted","Data":"f4730bf4952523205d90bf18e94fa832415b42efd7f6b414153b8aa79a904694"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.795713 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nm8z2" event={"ID":"75033a4c-93b8-44c8-9456-437afb1b80ce","Type":"ContainerStarted","Data":"2b8325be18ed29f13a4fca8fc16a5d4f7ad1a91b1275dddc03242390c7967645"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.795775 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nm8z2" event={"ID":"75033a4c-93b8-44c8-9456-437afb1b80ce","Type":"ContainerStarted","Data":"efdee060212a3590b180ed2378436be00d3c10177a93f9c7e4bad746d5df4973"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.798175 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-2pjzb" podStartSLOduration=124.798150556 podStartE2EDuration="2m4.798150556s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:46.794284377 +0000 UTC m=+144.028755295" 
watchObservedRunningTime="2025-12-01 10:33:46.798150556 +0000 UTC m=+144.032621454" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.817244 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h7pgb" event={"ID":"ddb74535-f2d6-4818-a4c9-4876876d70cf","Type":"ContainerStarted","Data":"37b934b7e7da77e85a5163fa7e9a8d350b0722ab5d9bef72dc822973a01ace4e"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.820328 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-shvkd" event={"ID":"f992ea3b-b39a-48c5-84d5-67a8b6efcbd1","Type":"ContainerStarted","Data":"f8cbebf9365e37b304246816ff9ec75741cebfdad95d7adb0b3ab12d13f91b3a"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.820373 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-shvkd" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.820385 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-shvkd" event={"ID":"f992ea3b-b39a-48c5-84d5-67a8b6efcbd1","Type":"ContainerStarted","Data":"7e6e37df31a87efab82c65ceb929e150f6eb35fd1b71136ef0a6f2a2624554fc"} Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.822297 4909 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-shvkd container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.822336 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-shvkd" podUID="f992ea3b-b39a-48c5-84d5-67a8b6efcbd1" containerName="olm-operator" probeResult="failure" output="Get 
\"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.834966 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j597g" podStartSLOduration=123.834951958 podStartE2EDuration="2m3.834951958s" podCreationTimestamp="2025-12-01 10:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:46.8330597 +0000 UTC m=+144.067530598" watchObservedRunningTime="2025-12-01 10:33:46.834951958 +0000 UTC m=+144.069422856" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.851197 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:46 crc kubenswrapper[4909]: E1201 10:33:46.851655 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:47.351637627 +0000 UTC m=+144.586108525 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.878179 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-w927p" podStartSLOduration=123.878162481 podStartE2EDuration="2m3.878162481s" podCreationTimestamp="2025-12-01 10:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:46.875792155 +0000 UTC m=+144.110263053" watchObservedRunningTime="2025-12-01 10:33:46.878162481 +0000 UTC m=+144.112633379" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.942091 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-67lbv" podStartSLOduration=123.942075226 podStartE2EDuration="2m3.942075226s" podCreationTimestamp="2025-12-01 10:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:46.941228646 +0000 UTC m=+144.175699564" watchObservedRunningTime="2025-12-01 10:33:46.942075226 +0000 UTC m=+144.176546134" Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.953078 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: 
\"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:46 crc kubenswrapper[4909]: E1201 10:33:46.955434 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:47.455421015 +0000 UTC m=+144.689891913 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:46 crc kubenswrapper[4909]: I1201 10:33:46.972704 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-455s8" podStartSLOduration=123.972688316 podStartE2EDuration="2m3.972688316s" podCreationTimestamp="2025-12-01 10:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:46.971100568 +0000 UTC m=+144.205571466" watchObservedRunningTime="2025-12-01 10:33:46.972688316 +0000 UTC m=+144.207159214" Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.000540 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g7qpw" podStartSLOduration=125.000523566 podStartE2EDuration="2m5.000523566s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-01 10:33:46.99981607 +0000 UTC m=+144.234286968" watchObservedRunningTime="2025-12-01 10:33:47.000523566 +0000 UTC m=+144.234994474" Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.055482 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:47 crc kubenswrapper[4909]: E1201 10:33:47.055942 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:47.555920456 +0000 UTC m=+144.790391354 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.074342 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nm8z2" podStartSLOduration=125.074322557 podStartE2EDuration="2m5.074322557s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:47.073808999 +0000 UTC m=+144.308279897" watchObservedRunningTime="2025-12-01 
10:33:47.074322557 +0000 UTC m=+144.308793465" Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.075128 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" podStartSLOduration=124.075104855 podStartE2EDuration="2m4.075104855s" podCreationTimestamp="2025-12-01 10:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:47.03125802 +0000 UTC m=+144.265728928" watchObservedRunningTime="2025-12-01 10:33:47.075104855 +0000 UTC m=+144.309575753" Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.130967 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-h7pgb" podStartSLOduration=125.130942581 podStartE2EDuration="2m5.130942581s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:47.127242818 +0000 UTC m=+144.361713726" watchObservedRunningTime="2025-12-01 10:33:47.130942581 +0000 UTC m=+144.365413509" Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.157495 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:47 crc kubenswrapper[4909]: E1201 10:33:47.158599 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-01 10:33:47.658577533 +0000 UTC m=+144.893048511 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.163491 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-shvkd" podStartSLOduration=124.16347302 podStartE2EDuration="2m4.16347302s" podCreationTimestamp="2025-12-01 10:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:47.163288963 +0000 UTC m=+144.397759881" watchObservedRunningTime="2025-12-01 10:33:47.16347302 +0000 UTC m=+144.397943918" Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.259962 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:47 crc kubenswrapper[4909]: E1201 10:33:47.260212 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:47.760191904 +0000 UTC m=+144.994662802 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.260369 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:47 crc kubenswrapper[4909]: E1201 10:33:47.260672 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:47.760665181 +0000 UTC m=+144.995136079 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.361261 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:47 crc kubenswrapper[4909]: E1201 10:33:47.361618 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:47.861598607 +0000 UTC m=+145.096069505 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.463233 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:47 crc kubenswrapper[4909]: E1201 10:33:47.463970 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:47.963952953 +0000 UTC m=+145.198423851 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.565267 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:47 crc kubenswrapper[4909]: E1201 10:33:47.565507 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:48.065471531 +0000 UTC m=+145.299942429 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.565851 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:47 crc kubenswrapper[4909]: E1201 10:33:47.585139 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:48.085114666 +0000 UTC m=+145.319585564 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.667568 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:47 crc kubenswrapper[4909]: E1201 10:33:47.668177 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:48.168155989 +0000 UTC m=+145.402626887 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.678524 4909 patch_prober.go:28] interesting pod/router-default-5444994796-2q24w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:47 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Dec 01 10:33:47 crc kubenswrapper[4909]: [+]process-running ok Dec 01 10:33:47 crc kubenswrapper[4909]: healthz check failed Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.678617 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2q24w" podUID="15547104-3163-44a2-9b36-4f4d0f3cde37" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.686544 4909 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.732379 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sbn7" Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.769634 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:47 crc kubenswrapper[4909]: E1201 10:33:47.770000 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:48.269970207 +0000 UTC m=+145.504441105 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.861225 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-455s8" event={"ID":"da6c8746-9c5d-4990-b66f-cae50ffbc83f","Type":"ContainerStarted","Data":"0024822afb3c7d96feb5abf158e8342925b592e1d10e7ce8ec0f7ad5356489fe"} Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.870017 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:47 crc kubenswrapper[4909]: E1201 10:33:47.871045 4909 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:48.371007867 +0000 UTC m=+145.605478765 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.876570 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k5k4b" event={"ID":"fc72371b-fc8b-4050-afc6-deca5f398d3d","Type":"ContainerStarted","Data":"e20323b8ebea6b8cf2fb127f82d8dab7b9213759341ac18a47b13e54abb66d62"} Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.891849 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-w927p" event={"ID":"ddaf67bb-df33-49ef-b65e-e77eb630f5e5","Type":"ContainerStarted","Data":"a1fb308c15bf5fa45b0313da74f3cd206ccbf777645099f04c0597af1e96d679"} Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.898046 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" event={"ID":"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342","Type":"ContainerStarted","Data":"4152bd0592298a7b2794dec903a3ba28eed9c4c433d507f3e91a4b18248d258b"} Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.898094 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" 
event={"ID":"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342","Type":"ContainerStarted","Data":"0bb061b5a09fafe40804d30dafb171743809c82a88d8485fa041df147f829568"} Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.901551 4909 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7cd6g container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.901607 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" podUID="ac0e24f9-2a35-4c96-b694-472eab5c4f15" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.916953 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-shvkd" Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.917376 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mz2g7" Dec 01 10:33:47 crc kubenswrapper[4909]: I1201 10:33:47.973754 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:47 crc kubenswrapper[4909]: E1201 10:33:47.976237 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:48.476208105 +0000 UTC m=+145.710679193 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.078396 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:48 crc kubenswrapper[4909]: E1201 10:33:48.078907 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:48.578890535 +0000 UTC m=+145.813361433 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.180612 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:48 crc kubenswrapper[4909]: E1201 10:33:48.180968 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:48.680956521 +0000 UTC m=+145.915427419 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.238373 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-2pjzb" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.281592 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:48 crc kubenswrapper[4909]: E1201 10:33:48.281798 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:48.781765312 +0000 UTC m=+146.016236220 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.281991 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:48 crc kubenswrapper[4909]: E1201 10:33:48.282264 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:48.78224968 +0000 UTC m=+146.016720578 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.382944 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:48 crc kubenswrapper[4909]: E1201 10:33:48.383134 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:33:48.883110973 +0000 UTC m=+146.117581871 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.383250 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:48 crc kubenswrapper[4909]: E1201 10:33:48.383590 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:33:48.88358179 +0000 UTC m=+146.118052688 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7mhb9" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.454779 4909 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-01T10:33:47.686574451Z","Handler":null,"Name":""} Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.480484 4909 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.480551 4909 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.484046 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.488302 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.534796 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vrjdh"] Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.535780 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vrjdh" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.537824 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.553798 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vrjdh"] Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.585456 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xjr6\" (UniqueName: \"kubernetes.io/projected/87a9da3f-7bc4-42e3-ade7-b4728615a67b-kube-api-access-7xjr6\") pod \"certified-operators-vrjdh\" (UID: \"87a9da3f-7bc4-42e3-ade7-b4728615a67b\") " pod="openshift-marketplace/certified-operators-vrjdh" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.585855 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.585964 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/87a9da3f-7bc4-42e3-ade7-b4728615a67b-catalog-content\") pod \"certified-operators-vrjdh\" (UID: \"87a9da3f-7bc4-42e3-ade7-b4728615a67b\") " pod="openshift-marketplace/certified-operators-vrjdh" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.586018 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87a9da3f-7bc4-42e3-ade7-b4728615a67b-utilities\") pod \"certified-operators-vrjdh\" (UID: \"87a9da3f-7bc4-42e3-ade7-b4728615a67b\") " pod="openshift-marketplace/certified-operators-vrjdh" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.608235 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.608310 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.687379 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xjr6\" (UniqueName: \"kubernetes.io/projected/87a9da3f-7bc4-42e3-ade7-b4728615a67b-kube-api-access-7xjr6\") pod \"certified-operators-vrjdh\" (UID: \"87a9da3f-7bc4-42e3-ade7-b4728615a67b\") " pod="openshift-marketplace/certified-operators-vrjdh" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.687453 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87a9da3f-7bc4-42e3-ade7-b4728615a67b-catalog-content\") pod \"certified-operators-vrjdh\" (UID: \"87a9da3f-7bc4-42e3-ade7-b4728615a67b\") " pod="openshift-marketplace/certified-operators-vrjdh" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.687492 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87a9da3f-7bc4-42e3-ade7-b4728615a67b-utilities\") pod \"certified-operators-vrjdh\" (UID: \"87a9da3f-7bc4-42e3-ade7-b4728615a67b\") " pod="openshift-marketplace/certified-operators-vrjdh" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.688269 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87a9da3f-7bc4-42e3-ade7-b4728615a67b-utilities\") pod \"certified-operators-vrjdh\" (UID: \"87a9da3f-7bc4-42e3-ade7-b4728615a67b\") " pod="openshift-marketplace/certified-operators-vrjdh" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.689020 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87a9da3f-7bc4-42e3-ade7-b4728615a67b-catalog-content\") pod \"certified-operators-vrjdh\" (UID: \"87a9da3f-7bc4-42e3-ade7-b4728615a67b\") " pod="openshift-marketplace/certified-operators-vrjdh" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.689073 4909 patch_prober.go:28] interesting pod/router-default-5444994796-2q24w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:48 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Dec 01 10:33:48 crc kubenswrapper[4909]: [+]process-running ok Dec 01 10:33:48 crc kubenswrapper[4909]: healthz check failed Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.689105 4909 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2q24w" podUID="15547104-3163-44a2-9b36-4f4d0f3cde37" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.717642 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2kbh7"] Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.718747 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2kbh7" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.724759 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xjr6\" (UniqueName: \"kubernetes.io/projected/87a9da3f-7bc4-42e3-ade7-b4728615a67b-kube-api-access-7xjr6\") pod \"certified-operators-vrjdh\" (UID: \"87a9da3f-7bc4-42e3-ade7-b4728615a67b\") " pod="openshift-marketplace/certified-operators-vrjdh" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.725707 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.738100 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2kbh7"] Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.788519 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13706ba3-7e21-4e1f-ac26-65e9f781d809-catalog-content\") pod \"community-operators-2kbh7\" (UID: \"13706ba3-7e21-4e1f-ac26-65e9f781d809\") " pod="openshift-marketplace/community-operators-2kbh7" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.788620 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb8lv\" (UniqueName: 
\"kubernetes.io/projected/13706ba3-7e21-4e1f-ac26-65e9f781d809-kube-api-access-vb8lv\") pod \"community-operators-2kbh7\" (UID: \"13706ba3-7e21-4e1f-ac26-65e9f781d809\") " pod="openshift-marketplace/community-operators-2kbh7" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.788647 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13706ba3-7e21-4e1f-ac26-65e9f781d809-utilities\") pod \"community-operators-2kbh7\" (UID: \"13706ba3-7e21-4e1f-ac26-65e9f781d809\") " pod="openshift-marketplace/community-operators-2kbh7" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.799080 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7mhb9\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.883539 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vrjdh" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.891006 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb8lv\" (UniqueName: \"kubernetes.io/projected/13706ba3-7e21-4e1f-ac26-65e9f781d809-kube-api-access-vb8lv\") pod \"community-operators-2kbh7\" (UID: \"13706ba3-7e21-4e1f-ac26-65e9f781d809\") " pod="openshift-marketplace/community-operators-2kbh7" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.891060 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13706ba3-7e21-4e1f-ac26-65e9f781d809-utilities\") pod \"community-operators-2kbh7\" (UID: \"13706ba3-7e21-4e1f-ac26-65e9f781d809\") " pod="openshift-marketplace/community-operators-2kbh7" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.891160 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13706ba3-7e21-4e1f-ac26-65e9f781d809-catalog-content\") pod \"community-operators-2kbh7\" (UID: \"13706ba3-7e21-4e1f-ac26-65e9f781d809\") " pod="openshift-marketplace/community-operators-2kbh7" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.891747 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13706ba3-7e21-4e1f-ac26-65e9f781d809-utilities\") pod \"community-operators-2kbh7\" (UID: \"13706ba3-7e21-4e1f-ac26-65e9f781d809\") " pod="openshift-marketplace/community-operators-2kbh7" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.891901 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13706ba3-7e21-4e1f-ac26-65e9f781d809-catalog-content\") pod \"community-operators-2kbh7\" (UID: \"13706ba3-7e21-4e1f-ac26-65e9f781d809\") " 
pod="openshift-marketplace/community-operators-2kbh7" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.904544 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" event={"ID":"f8fe7db3-57ec-46e1-9cf8-ed1d429ec342","Type":"ContainerStarted","Data":"2dbd50f630b0e20ccae708b0b1b768adfc19f5f325e5ae9849d0c6e7326bf655"} Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.904668 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.915561 4909 generic.go:334] "Generic (PLEG): container finished" podID="d6190ef7-3deb-4bd9-9c73-109572e871d1" containerID="a71c7aa8238a55a6461527a2dd182f2c7475657eb0842df727c47fdf1e8313a8" exitCode=0 Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.916364 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h" event={"ID":"d6190ef7-3deb-4bd9-9c73-109572e871d1","Type":"ContainerDied","Data":"a71c7aa8238a55a6461527a2dd182f2c7475657eb0842df727c47fdf1e8313a8"} Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.917930 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb8lv\" (UniqueName: \"kubernetes.io/projected/13706ba3-7e21-4e1f-ac26-65e9f781d809-kube-api-access-vb8lv\") pod \"community-operators-2kbh7\" (UID: \"13706ba3-7e21-4e1f-ac26-65e9f781d809\") " pod="openshift-marketplace/community-operators-2kbh7" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.925168 4909 patch_prober.go:28] interesting pod/downloads-7954f5f757-2n2sj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.925320 4909 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2n2sj" podUID="987cc233-91d0-4ed1-8d93-62e90e5e0925" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.925356 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.925512 4909 patch_prober.go:28] interesting pod/downloads-7954f5f757-2n2sj container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.925581 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2n2sj" podUID="987cc233-91d0-4ed1-8d93-62e90e5e0925" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.943525 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-khzr6"] Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.945053 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-khzr6" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.948627 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ck6xv" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.949416 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-dgdsb" podStartSLOduration=10.949397766 podStartE2EDuration="10.949397766s" podCreationTimestamp="2025-12-01 10:33:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:48.931814464 +0000 UTC m=+146.166285352" watchObservedRunningTime="2025-12-01 10:33:48.949397766 +0000 UTC m=+146.183868684" Dec 01 10:33:48 crc kubenswrapper[4909]: I1201 10:33:48.975005 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-khzr6"] Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.002473 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bfb46e1-7822-4508-ac16-d71780bcfe30-utilities\") pod \"certified-operators-khzr6\" (UID: \"6bfb46e1-7822-4508-ac16-d71780bcfe30\") " pod="openshift-marketplace/certified-operators-khzr6" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.002597 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bfb46e1-7822-4508-ac16-d71780bcfe30-catalog-content\") pod \"certified-operators-khzr6\" (UID: \"6bfb46e1-7822-4508-ac16-d71780bcfe30\") " pod="openshift-marketplace/certified-operators-khzr6" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.002725 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxn5t\" (UniqueName: \"kubernetes.io/projected/6bfb46e1-7822-4508-ac16-d71780bcfe30-kube-api-access-rxn5t\") pod \"certified-operators-khzr6\" (UID: \"6bfb46e1-7822-4508-ac16-d71780bcfe30\") " pod="openshift-marketplace/certified-operators-khzr6" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.050325 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2kbh7" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.105392 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bfb46e1-7822-4508-ac16-d71780bcfe30-catalog-content\") pod \"certified-operators-khzr6\" (UID: \"6bfb46e1-7822-4508-ac16-d71780bcfe30\") " pod="openshift-marketplace/certified-operators-khzr6" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.105521 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxn5t\" (UniqueName: \"kubernetes.io/projected/6bfb46e1-7822-4508-ac16-d71780bcfe30-kube-api-access-rxn5t\") pod \"certified-operators-khzr6\" (UID: \"6bfb46e1-7822-4508-ac16-d71780bcfe30\") " pod="openshift-marketplace/certified-operators-khzr6" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.105587 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bfb46e1-7822-4508-ac16-d71780bcfe30-utilities\") pod \"certified-operators-khzr6\" (UID: \"6bfb46e1-7822-4508-ac16-d71780bcfe30\") " pod="openshift-marketplace/certified-operators-khzr6" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.106036 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bfb46e1-7822-4508-ac16-d71780bcfe30-catalog-content\") pod 
\"certified-operators-khzr6\" (UID: \"6bfb46e1-7822-4508-ac16-d71780bcfe30\") " pod="openshift-marketplace/certified-operators-khzr6" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.106110 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bfb46e1-7822-4508-ac16-d71780bcfe30-utilities\") pod \"certified-operators-khzr6\" (UID: \"6bfb46e1-7822-4508-ac16-d71780bcfe30\") " pod="openshift-marketplace/certified-operators-khzr6" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.119714 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zd7jf"] Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.121362 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zd7jf" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.139951 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxn5t\" (UniqueName: \"kubernetes.io/projected/6bfb46e1-7822-4508-ac16-d71780bcfe30-kube-api-access-rxn5t\") pod \"certified-operators-khzr6\" (UID: \"6bfb46e1-7822-4508-ac16-d71780bcfe30\") " pod="openshift-marketplace/certified-operators-khzr6" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.144810 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zd7jf"] Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.207418 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3ef24d9-67de-4a08-82ff-092b66feb19f-catalog-content\") pod \"community-operators-zd7jf\" (UID: \"e3ef24d9-67de-4a08-82ff-092b66feb19f\") " pod="openshift-marketplace/community-operators-zd7jf" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.207489 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2lxt\" (UniqueName: \"kubernetes.io/projected/e3ef24d9-67de-4a08-82ff-092b66feb19f-kube-api-access-k2lxt\") pod \"community-operators-zd7jf\" (UID: \"e3ef24d9-67de-4a08-82ff-092b66feb19f\") " pod="openshift-marketplace/community-operators-zd7jf" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.207517 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3ef24d9-67de-4a08-82ff-092b66feb19f-utilities\") pod \"community-operators-zd7jf\" (UID: \"e3ef24d9-67de-4a08-82ff-092b66feb19f\") " pod="openshift-marketplace/community-operators-zd7jf" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.208081 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7mhb9"] Dec 01 10:33:49 crc kubenswrapper[4909]: W1201 10:33:49.224788 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ba162cc_ca36_4d6d_9034_7b3ad6f59179.slice/crio-d6367f632a1b9feab771c3b0c3a52431489b228e83dcb2576bac8690d7c575b6 WatchSource:0}: Error finding container d6367f632a1b9feab771c3b0c3a52431489b228e83dcb2576bac8690d7c575b6: Status 404 returned error can't find the container with id d6367f632a1b9feab771c3b0c3a52431489b228e83dcb2576bac8690d7c575b6 Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.266977 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.269790 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-khzr6" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.310283 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2lxt\" (UniqueName: \"kubernetes.io/projected/e3ef24d9-67de-4a08-82ff-092b66feb19f-kube-api-access-k2lxt\") pod \"community-operators-zd7jf\" (UID: \"e3ef24d9-67de-4a08-82ff-092b66feb19f\") " pod="openshift-marketplace/community-operators-zd7jf" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.310486 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3ef24d9-67de-4a08-82ff-092b66feb19f-utilities\") pod \"community-operators-zd7jf\" (UID: \"e3ef24d9-67de-4a08-82ff-092b66feb19f\") " pod="openshift-marketplace/community-operators-zd7jf" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.310688 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3ef24d9-67de-4a08-82ff-092b66feb19f-catalog-content\") pod \"community-operators-zd7jf\" (UID: \"e3ef24d9-67de-4a08-82ff-092b66feb19f\") " pod="openshift-marketplace/community-operators-zd7jf" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.311362 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3ef24d9-67de-4a08-82ff-092b66feb19f-catalog-content\") pod \"community-operators-zd7jf\" (UID: \"e3ef24d9-67de-4a08-82ff-092b66feb19f\") " pod="openshift-marketplace/community-operators-zd7jf" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.311619 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3ef24d9-67de-4a08-82ff-092b66feb19f-utilities\") pod \"community-operators-zd7jf\" (UID: \"e3ef24d9-67de-4a08-82ff-092b66feb19f\") " 
pod="openshift-marketplace/community-operators-zd7jf" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.334931 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2lxt\" (UniqueName: \"kubernetes.io/projected/e3ef24d9-67de-4a08-82ff-092b66feb19f-kube-api-access-k2lxt\") pod \"community-operators-zd7jf\" (UID: \"e3ef24d9-67de-4a08-82ff-092b66feb19f\") " pod="openshift-marketplace/community-operators-zd7jf" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.366175 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2kbh7"] Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.469538 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zd7jf" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.480388 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vrjdh"] Dec 01 10:33:49 crc kubenswrapper[4909]: W1201 10:33:49.498452 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87a9da3f_7bc4_42e3_ade7_b4728615a67b.slice/crio-64ec9807945c76fdf082c0bc405c4533733556fdf8561ec020faae6ac6d14da3 WatchSource:0}: Error finding container 64ec9807945c76fdf082c0bc405c4533733556fdf8561ec020faae6ac6d14da3: Status 404 returned error can't find the container with id 64ec9807945c76fdf082c0bc405c4533733556fdf8561ec020faae6ac6d14da3 Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.514308 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-khzr6"] Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.628279 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.629350 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.629891 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.630526 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.630980 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.631072 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.632392 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.632683 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.633516 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.633835 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.636225 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.636812 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.654410 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.679308 4909 patch_prober.go:28] interesting pod/router-default-5444994796-2q24w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:49 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Dec 01 10:33:49 crc kubenswrapper[4909]: [+]process-running ok Dec 01 10:33:49 crc kubenswrapper[4909]: healthz check failed Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.679368 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2q24w" podUID="15547104-3163-44a2-9b36-4f4d0f3cde37" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.733954 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d82be23d-d707-47c0-b3da-ad9c1dc7d304-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d82be23d-d707-47c0-b3da-ad9c1dc7d304\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.734011 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d82be23d-d707-47c0-b3da-ad9c1dc7d304-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d82be23d-d707-47c0-b3da-ad9c1dc7d304\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.766805 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zd7jf"] Dec 01 10:33:49 crc kubenswrapper[4909]: W1201 10:33:49.792390 
4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ef24d9_67de_4a08_82ff_092b66feb19f.slice/crio-d745c389827523892ebd936ccc09275ba6a213ca72e14384266934febecaf93d WatchSource:0}: Error finding container d745c389827523892ebd936ccc09275ba6a213ca72e14384266934febecaf93d: Status 404 returned error can't find the container with id d745c389827523892ebd936ccc09275ba6a213ca72e14384266934febecaf93d Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.834930 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d82be23d-d707-47c0-b3da-ad9c1dc7d304-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d82be23d-d707-47c0-b3da-ad9c1dc7d304\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.834986 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d82be23d-d707-47c0-b3da-ad9c1dc7d304-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d82be23d-d707-47c0-b3da-ad9c1dc7d304\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.835134 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d82be23d-d707-47c0-b3da-ad9c1dc7d304-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d82be23d-d707-47c0-b3da-ad9c1dc7d304\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.856626 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d82be23d-d707-47c0-b3da-ad9c1dc7d304-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d82be23d-d707-47c0-b3da-ad9c1dc7d304\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.900220 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.917522 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.929309 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.929348 4909 generic.go:334] "Generic (PLEG): container finished" podID="6bfb46e1-7822-4508-ac16-d71780bcfe30" containerID="b7ff9618fe72179db1f2993a630a9a3737f036d5f4cb17843fdd56975a1a2c26" exitCode=0 Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.929791 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khzr6" event={"ID":"6bfb46e1-7822-4508-ac16-d71780bcfe30","Type":"ContainerDied","Data":"b7ff9618fe72179db1f2993a630a9a3737f036d5f4cb17843fdd56975a1a2c26"} Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.929842 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khzr6" event={"ID":"6bfb46e1-7822-4508-ac16-d71780bcfe30","Type":"ContainerStarted","Data":"1546be78afba9f432d86884c0779beaf03db9a580538a4ee687aba933825a669"} Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.935791 4909 generic.go:334] "Generic (PLEG): container finished" podID="87a9da3f-7bc4-42e3-ade7-b4728615a67b" containerID="a56ee55a6b289a6e1b43296eb2b7e27ab8c8ba7629814bbdb9c998fd99923da2" exitCode=0 Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.935890 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-vrjdh" event={"ID":"87a9da3f-7bc4-42e3-ade7-b4728615a67b","Type":"ContainerDied","Data":"a56ee55a6b289a6e1b43296eb2b7e27ab8c8ba7629814bbdb9c998fd99923da2"} Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.935927 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrjdh" event={"ID":"87a9da3f-7bc4-42e3-ade7-b4728615a67b","Type":"ContainerStarted","Data":"64ec9807945c76fdf082c0bc405c4533733556fdf8561ec020faae6ac6d14da3"} Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.936282 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.937989 4909 generic.go:334] "Generic (PLEG): container finished" podID="13706ba3-7e21-4e1f-ac26-65e9f781d809" containerID="589c50e36ad19ed34c2680336b679d5251441acf337c2081c99a15de5e632233" exitCode=0 Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.938060 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2kbh7" event={"ID":"13706ba3-7e21-4e1f-ac26-65e9f781d809","Type":"ContainerDied","Data":"589c50e36ad19ed34c2680336b679d5251441acf337c2081c99a15de5e632233"} Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.938091 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2kbh7" event={"ID":"13706ba3-7e21-4e1f-ac26-65e9f781d809","Type":"ContainerStarted","Data":"d3547de5ac2ba26b2ea1e9a829643d3263482d34525b56fe94a2177d4d63201b"} Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.944645 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zd7jf" event={"ID":"e3ef24d9-67de-4a08-82ff-092b66feb19f","Type":"ContainerStarted","Data":"d745c389827523892ebd936ccc09275ba6a213ca72e14384266934febecaf93d"} Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.955136 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" event={"ID":"4ba162cc-ca36-4d6d-9034-7b3ad6f59179","Type":"ContainerStarted","Data":"bf61c6461e409536da1c43bd644a1703408990dba3e62fbf7e649dc4517d9bd5"} Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.955198 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" event={"ID":"4ba162cc-ca36-4d6d-9034-7b3ad6f59179","Type":"ContainerStarted","Data":"d6367f632a1b9feab771c3b0c3a52431489b228e83dcb2576bac8690d7c575b6"} Dec 01 10:33:49 crc kubenswrapper[4909]: I1201 10:33:49.955357 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.026655 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" podStartSLOduration=128.026582812 podStartE2EDuration="2m8.026582812s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:50.020587056 +0000 UTC m=+147.255057964" watchObservedRunningTime="2025-12-01 10:33:50.026582812 +0000 UTC m=+147.261053710" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.027188 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.149443 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.149939 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.160193 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.161758 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.167585 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.170159 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.195456 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.241603 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hkvz\" (UniqueName: \"kubernetes.io/projected/d6190ef7-3deb-4bd9-9c73-109572e871d1-kube-api-access-8hkvz\") pod \"d6190ef7-3deb-4bd9-9c73-109572e871d1\" (UID: \"d6190ef7-3deb-4bd9-9c73-109572e871d1\") " Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.241809 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6190ef7-3deb-4bd9-9c73-109572e871d1-secret-volume\") pod \"d6190ef7-3deb-4bd9-9c73-109572e871d1\" (UID: \"d6190ef7-3deb-4bd9-9c73-109572e871d1\") " Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.241858 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6190ef7-3deb-4bd9-9c73-109572e871d1-config-volume\") pod \"d6190ef7-3deb-4bd9-9c73-109572e871d1\" (UID: \"d6190ef7-3deb-4bd9-9c73-109572e871d1\") " Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.244148 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6190ef7-3deb-4bd9-9c73-109572e871d1-config-volume" (OuterVolumeSpecName: "config-volume") pod "d6190ef7-3deb-4bd9-9c73-109572e871d1" (UID: "d6190ef7-3deb-4bd9-9c73-109572e871d1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.250849 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6190ef7-3deb-4bd9-9c73-109572e871d1-kube-api-access-8hkvz" (OuterVolumeSpecName: "kube-api-access-8hkvz") pod "d6190ef7-3deb-4bd9-9c73-109572e871d1" (UID: "d6190ef7-3deb-4bd9-9c73-109572e871d1"). 
InnerVolumeSpecName "kube-api-access-8hkvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.258142 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6190ef7-3deb-4bd9-9c73-109572e871d1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d6190ef7-3deb-4bd9-9c73-109572e871d1" (UID: "d6190ef7-3deb-4bd9-9c73-109572e871d1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.344116 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6190ef7-3deb-4bd9-9c73-109572e871d1-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.344682 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6190ef7-3deb-4bd9-9c73-109572e871d1-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.344704 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hkvz\" (UniqueName: \"kubernetes.io/projected/d6190ef7-3deb-4bd9-9c73-109572e871d1-kube-api-access-8hkvz\") on node \"crc\" DevicePath \"\"" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.546676 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 10:33:50 crc kubenswrapper[4909]: W1201 10:33:50.549480 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd82be23d_d707_47c0_b3da_ad9c1dc7d304.slice/crio-bd5af652fca318cc10a39fac2effb802c67a990b24ef2866f9caeed4358f138a WatchSource:0}: Error finding container bd5af652fca318cc10a39fac2effb802c67a990b24ef2866f9caeed4358f138a: Status 404 returned error can't find the container with id 
bd5af652fca318cc10a39fac2effb802c67a990b24ef2866f9caeed4358f138a Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.685281 4909 patch_prober.go:28] interesting pod/router-default-5444994796-2q24w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:50 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Dec 01 10:33:50 crc kubenswrapper[4909]: [+]process-running ok Dec 01 10:33:50 crc kubenswrapper[4909]: healthz check failed Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.685378 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2q24w" podUID="15547104-3163-44a2-9b36-4f4d0f3cde37" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.716327 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bp5bb"] Dec 01 10:33:50 crc kubenswrapper[4909]: E1201 10:33:50.716584 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6190ef7-3deb-4bd9-9c73-109572e871d1" containerName="collect-profiles" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.716598 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6190ef7-3deb-4bd9-9c73-109572e871d1" containerName="collect-profiles" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.716712 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6190ef7-3deb-4bd9-9c73-109572e871d1" containerName="collect-profiles" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.717644 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bp5bb" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.719447 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.738058 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bp5bb"] Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.750548 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g44pg\" (UniqueName: \"kubernetes.io/projected/8949a76d-2e07-49ff-888a-0ca883b56996-kube-api-access-g44pg\") pod \"redhat-marketplace-bp5bb\" (UID: \"8949a76d-2e07-49ff-888a-0ca883b56996\") " pod="openshift-marketplace/redhat-marketplace-bp5bb" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.750635 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8949a76d-2e07-49ff-888a-0ca883b56996-catalog-content\") pod \"redhat-marketplace-bp5bb\" (UID: \"8949a76d-2e07-49ff-888a-0ca883b56996\") " pod="openshift-marketplace/redhat-marketplace-bp5bb" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.750674 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8949a76d-2e07-49ff-888a-0ca883b56996-utilities\") pod \"redhat-marketplace-bp5bb\" (UID: \"8949a76d-2e07-49ff-888a-0ca883b56996\") " pod="openshift-marketplace/redhat-marketplace-bp5bb" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.852492 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g44pg\" (UniqueName: \"kubernetes.io/projected/8949a76d-2e07-49ff-888a-0ca883b56996-kube-api-access-g44pg\") pod \"redhat-marketplace-bp5bb\" (UID: 
\"8949a76d-2e07-49ff-888a-0ca883b56996\") " pod="openshift-marketplace/redhat-marketplace-bp5bb" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.852570 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8949a76d-2e07-49ff-888a-0ca883b56996-catalog-content\") pod \"redhat-marketplace-bp5bb\" (UID: \"8949a76d-2e07-49ff-888a-0ca883b56996\") " pod="openshift-marketplace/redhat-marketplace-bp5bb" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.852603 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8949a76d-2e07-49ff-888a-0ca883b56996-utilities\") pod \"redhat-marketplace-bp5bb\" (UID: \"8949a76d-2e07-49ff-888a-0ca883b56996\") " pod="openshift-marketplace/redhat-marketplace-bp5bb" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.853256 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8949a76d-2e07-49ff-888a-0ca883b56996-utilities\") pod \"redhat-marketplace-bp5bb\" (UID: \"8949a76d-2e07-49ff-888a-0ca883b56996\") " pod="openshift-marketplace/redhat-marketplace-bp5bb" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.853268 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8949a76d-2e07-49ff-888a-0ca883b56996-catalog-content\") pod \"redhat-marketplace-bp5bb\" (UID: \"8949a76d-2e07-49ff-888a-0ca883b56996\") " pod="openshift-marketplace/redhat-marketplace-bp5bb" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.876826 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g44pg\" (UniqueName: \"kubernetes.io/projected/8949a76d-2e07-49ff-888a-0ca883b56996-kube-api-access-g44pg\") pod \"redhat-marketplace-bp5bb\" (UID: \"8949a76d-2e07-49ff-888a-0ca883b56996\") " 
pod="openshift-marketplace/redhat-marketplace-bp5bb" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.968922 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e9035bbe44ff4273b3bdfbbd8cc8bad4d6ea671b897ea677301733dafb5f6e93"} Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.969006 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a3893428091e6830fa7e8d23f410e92c684af8b7d9d2609157e9754781c56037"} Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.970552 4909 generic.go:334] "Generic (PLEG): container finished" podID="e3ef24d9-67de-4a08-82ff-092b66feb19f" containerID="c053df2dff072ecba718c484ee1d2662ef042f1e1bc4795724358e26f214f5c9" exitCode=0 Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.970735 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zd7jf" event={"ID":"e3ef24d9-67de-4a08-82ff-092b66feb19f","Type":"ContainerDied","Data":"c053df2dff072ecba718c484ee1d2662ef042f1e1bc4795724358e26f214f5c9"} Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.972484 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3898c19e068fc73deb85574169e3cc16f3736c52b7e5aab11f4f1275747d534c"} Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.972546 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2382bbaa13219abfaaee35c7e014674b3b2a6813ef98706f4b0bfa84905bf92a"} Dec 01 
10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.972786 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.974394 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d82be23d-d707-47c0-b3da-ad9c1dc7d304","Type":"ContainerStarted","Data":"bd5af652fca318cc10a39fac2effb802c67a990b24ef2866f9caeed4358f138a"} Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.979227 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c6c5a952750f5cfba9ae7605b6062468f47a98b5fcac633bdc1ec2030ea400ea"} Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.979266 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c3e681f72cdc2a9a0db19daa82dbcb7fe636bbc47364b759254eb2b2a3d84da2"} Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.983946 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.983973 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h" event={"ID":"d6190ef7-3deb-4bd9-9c73-109572e871d1","Type":"ContainerDied","Data":"e2c03736b389ef96455059846303653cd2c26d7c9e2fbcf0d6d9f44796d90f37"} Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.984053 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2c03736b389ef96455059846303653cd2c26d7c9e2fbcf0d6d9f44796d90f37" Dec 01 10:33:50 crc kubenswrapper[4909]: I1201 10:33:50.990333 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-m2qzl" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.001197 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptv4" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.111453 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bp5bb" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.132849 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hns5g"] Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.143590 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hns5g" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.161207 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e5f070e-cd41-4dd5-93ad-0e73b96d3858-catalog-content\") pod \"redhat-marketplace-hns5g\" (UID: \"9e5f070e-cd41-4dd5-93ad-0e73b96d3858\") " pod="openshift-marketplace/redhat-marketplace-hns5g" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.161264 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e5f070e-cd41-4dd5-93ad-0e73b96d3858-utilities\") pod \"redhat-marketplace-hns5g\" (UID: \"9e5f070e-cd41-4dd5-93ad-0e73b96d3858\") " pod="openshift-marketplace/redhat-marketplace-hns5g" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.161290 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rd96\" (UniqueName: \"kubernetes.io/projected/9e5f070e-cd41-4dd5-93ad-0e73b96d3858-kube-api-access-6rd96\") pod \"redhat-marketplace-hns5g\" (UID: \"9e5f070e-cd41-4dd5-93ad-0e73b96d3858\") " pod="openshift-marketplace/redhat-marketplace-hns5g" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.175645 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hns5g"] Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.264594 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e5f070e-cd41-4dd5-93ad-0e73b96d3858-catalog-content\") pod \"redhat-marketplace-hns5g\" (UID: \"9e5f070e-cd41-4dd5-93ad-0e73b96d3858\") " pod="openshift-marketplace/redhat-marketplace-hns5g" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.265024 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e5f070e-cd41-4dd5-93ad-0e73b96d3858-utilities\") pod \"redhat-marketplace-hns5g\" (UID: \"9e5f070e-cd41-4dd5-93ad-0e73b96d3858\") " pod="openshift-marketplace/redhat-marketplace-hns5g" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.265063 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rd96\" (UniqueName: \"kubernetes.io/projected/9e5f070e-cd41-4dd5-93ad-0e73b96d3858-kube-api-access-6rd96\") pod \"redhat-marketplace-hns5g\" (UID: \"9e5f070e-cd41-4dd5-93ad-0e73b96d3858\") " pod="openshift-marketplace/redhat-marketplace-hns5g" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.266220 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e5f070e-cd41-4dd5-93ad-0e73b96d3858-catalog-content\") pod \"redhat-marketplace-hns5g\" (UID: \"9e5f070e-cd41-4dd5-93ad-0e73b96d3858\") " pod="openshift-marketplace/redhat-marketplace-hns5g" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.266442 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e5f070e-cd41-4dd5-93ad-0e73b96d3858-utilities\") pod \"redhat-marketplace-hns5g\" (UID: \"9e5f070e-cd41-4dd5-93ad-0e73b96d3858\") " pod="openshift-marketplace/redhat-marketplace-hns5g" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.294632 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rd96\" (UniqueName: \"kubernetes.io/projected/9e5f070e-cd41-4dd5-93ad-0e73b96d3858-kube-api-access-6rd96\") pod \"redhat-marketplace-hns5g\" (UID: \"9e5f070e-cd41-4dd5-93ad-0e73b96d3858\") " pod="openshift-marketplace/redhat-marketplace-hns5g" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.505234 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hns5g" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.645234 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bp5bb"] Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.674198 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-2q24w" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.678899 4909 patch_prober.go:28] interesting pod/router-default-5444994796-2q24w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:33:51 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Dec 01 10:33:51 crc kubenswrapper[4909]: [+]process-running ok Dec 01 10:33:51 crc kubenswrapper[4909]: healthz check failed Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.678965 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2q24w" podUID="15547104-3163-44a2-9b36-4f4d0f3cde37" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.717475 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-865bh"] Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.718565 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-865bh" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.731298 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.733113 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-865bh"] Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.777779 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b6f8e71-e11d-47d5-a8a2-c42e455f3f65-utilities\") pod \"redhat-operators-865bh\" (UID: \"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65\") " pod="openshift-marketplace/redhat-operators-865bh" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.777867 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrz6s\" (UniqueName: \"kubernetes.io/projected/6b6f8e71-e11d-47d5-a8a2-c42e455f3f65-kube-api-access-hrz6s\") pod \"redhat-operators-865bh\" (UID: \"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65\") " pod="openshift-marketplace/redhat-operators-865bh" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.778050 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b6f8e71-e11d-47d5-a8a2-c42e455f3f65-catalog-content\") pod \"redhat-operators-865bh\" (UID: \"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65\") " pod="openshift-marketplace/redhat-operators-865bh" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.839073 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hns5g"] Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.861594 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.861634 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.865060 4909 patch_prober.go:28] interesting pod/console-f9d7485db-wfgm2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.865103 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-wfgm2" podUID="a366b491-4c3c-40a9-86a0-a82d686b1a15" containerName="console" probeResult="failure" output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.880716 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrz6s\" (UniqueName: \"kubernetes.io/projected/6b6f8e71-e11d-47d5-a8a2-c42e455f3f65-kube-api-access-hrz6s\") pod \"redhat-operators-865bh\" (UID: \"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65\") " pod="openshift-marketplace/redhat-operators-865bh" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.880833 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b6f8e71-e11d-47d5-a8a2-c42e455f3f65-catalog-content\") pod \"redhat-operators-865bh\" (UID: \"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65\") " pod="openshift-marketplace/redhat-operators-865bh" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.880926 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b6f8e71-e11d-47d5-a8a2-c42e455f3f65-utilities\") pod \"redhat-operators-865bh\" (UID: 
\"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65\") " pod="openshift-marketplace/redhat-operators-865bh" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.881274 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b6f8e71-e11d-47d5-a8a2-c42e455f3f65-catalog-content\") pod \"redhat-operators-865bh\" (UID: \"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65\") " pod="openshift-marketplace/redhat-operators-865bh" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.882220 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b6f8e71-e11d-47d5-a8a2-c42e455f3f65-utilities\") pod \"redhat-operators-865bh\" (UID: \"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65\") " pod="openshift-marketplace/redhat-operators-865bh" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.911326 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrz6s\" (UniqueName: \"kubernetes.io/projected/6b6f8e71-e11d-47d5-a8a2-c42e455f3f65-kube-api-access-hrz6s\") pod \"redhat-operators-865bh\" (UID: \"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65\") " pod="openshift-marketplace/redhat-operators-865bh" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.914555 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.916798 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.923048 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.923288 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.923336 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.982565 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2d90f458-a780-449a-8760-c8311aa1204b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2d90f458-a780-449a-8760-c8311aa1204b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.982668 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d90f458-a780-449a-8760-c8311aa1204b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2d90f458-a780-449a-8760-c8311aa1204b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.992024 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d82be23d-d707-47c0-b3da-ad9c1dc7d304","Type":"ContainerStarted","Data":"d7a68e30fe87fc4fec50bbbf91dfe4f502782863876239355d942cdf6c194de2"} Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.993523 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hns5g" 
event={"ID":"9e5f070e-cd41-4dd5-93ad-0e73b96d3858","Type":"ContainerStarted","Data":"fe28ca0da2d3b41845d4986a5e6fb8c510747490805498557f34f019f00827ac"} Dec 01 10:33:51 crc kubenswrapper[4909]: I1201 10:33:51.994401 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp5bb" event={"ID":"8949a76d-2e07-49ff-888a-0ca883b56996","Type":"ContainerStarted","Data":"31b4a882eaabc0f6d05bb7788194452ae234f2fa96ac792bc5b626dc26973ad1"} Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.008076 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.008058973 podStartE2EDuration="3.008058973s" podCreationTimestamp="2025-12-01 10:33:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:33:52.007321546 +0000 UTC m=+149.241792444" watchObservedRunningTime="2025-12-01 10:33:52.008058973 +0000 UTC m=+149.242529871" Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.084814 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d90f458-a780-449a-8760-c8311aa1204b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2d90f458-a780-449a-8760-c8311aa1204b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.085681 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2d90f458-a780-449a-8760-c8311aa1204b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2d90f458-a780-449a-8760-c8311aa1204b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.086041 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/2d90f458-a780-449a-8760-c8311aa1204b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2d90f458-a780-449a-8760-c8311aa1204b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.109416 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d90f458-a780-449a-8760-c8311aa1204b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2d90f458-a780-449a-8760-c8311aa1204b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.126952 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5q9m7"] Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.128106 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5q9m7" Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.128844 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-865bh" Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.146031 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5q9m7"] Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.186829 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhwbj\" (UniqueName: \"kubernetes.io/projected/948b91e0-ecf3-4837-8389-4777dae9d246-kube-api-access-hhwbj\") pod \"redhat-operators-5q9m7\" (UID: \"948b91e0-ecf3-4837-8389-4777dae9d246\") " pod="openshift-marketplace/redhat-operators-5q9m7" Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.187091 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/948b91e0-ecf3-4837-8389-4777dae9d246-utilities\") pod \"redhat-operators-5q9m7\" (UID: \"948b91e0-ecf3-4837-8389-4777dae9d246\") " pod="openshift-marketplace/redhat-operators-5q9m7" Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.187173 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/948b91e0-ecf3-4837-8389-4777dae9d246-catalog-content\") pod \"redhat-operators-5q9m7\" (UID: \"948b91e0-ecf3-4837-8389-4777dae9d246\") " pod="openshift-marketplace/redhat-operators-5q9m7" Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.254916 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.288852 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhwbj\" (UniqueName: \"kubernetes.io/projected/948b91e0-ecf3-4837-8389-4777dae9d246-kube-api-access-hhwbj\") pod \"redhat-operators-5q9m7\" (UID: \"948b91e0-ecf3-4837-8389-4777dae9d246\") " pod="openshift-marketplace/redhat-operators-5q9m7" Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.289491 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/948b91e0-ecf3-4837-8389-4777dae9d246-utilities\") pod \"redhat-operators-5q9m7\" (UID: \"948b91e0-ecf3-4837-8389-4777dae9d246\") " pod="openshift-marketplace/redhat-operators-5q9m7" Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.289610 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/948b91e0-ecf3-4837-8389-4777dae9d246-catalog-content\") pod \"redhat-operators-5q9m7\" (UID: \"948b91e0-ecf3-4837-8389-4777dae9d246\") " pod="openshift-marketplace/redhat-operators-5q9m7" Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.290505 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/948b91e0-ecf3-4837-8389-4777dae9d246-catalog-content\") pod \"redhat-operators-5q9m7\" (UID: \"948b91e0-ecf3-4837-8389-4777dae9d246\") " pod="openshift-marketplace/redhat-operators-5q9m7" Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.294598 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/948b91e0-ecf3-4837-8389-4777dae9d246-utilities\") pod \"redhat-operators-5q9m7\" (UID: \"948b91e0-ecf3-4837-8389-4777dae9d246\") " 
pod="openshift-marketplace/redhat-operators-5q9m7" Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.307163 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhwbj\" (UniqueName: \"kubernetes.io/projected/948b91e0-ecf3-4837-8389-4777dae9d246-kube-api-access-hhwbj\") pod \"redhat-operators-5q9m7\" (UID: \"948b91e0-ecf3-4837-8389-4777dae9d246\") " pod="openshift-marketplace/redhat-operators-5q9m7" Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.330750 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5q9m7" Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.344161 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-865bh"] Dec 01 10:33:52 crc kubenswrapper[4909]: W1201 10:33:52.348634 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b6f8e71_e11d_47d5_a8a2_c42e455f3f65.slice/crio-230be9764671fab59f0269309a2f45293a973d49f740ec2f2c6384ae79b5e525 WatchSource:0}: Error finding container 230be9764671fab59f0269309a2f45293a973d49f740ec2f2c6384ae79b5e525: Status 404 returned error can't find the container with id 230be9764671fab59f0269309a2f45293a973d49f740ec2f2c6384ae79b5e525 Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.503232 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.547331 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5q9m7"] Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.678486 4909 patch_prober.go:28] interesting pod/router-default-5444994796-2q24w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
withheld
Dec 01 10:33:52 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld
Dec 01 10:33:52 crc kubenswrapper[4909]: [+]process-running ok
Dec 01 10:33:52 crc kubenswrapper[4909]: healthz check failed
Dec 01 10:33:52 crc kubenswrapper[4909]: I1201 10:33:52.678545 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2q24w" podUID="15547104-3163-44a2-9b36-4f4d0f3cde37" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 01 10:33:53 crc kubenswrapper[4909]: I1201 10:33:53.002988 4909 generic.go:334] "Generic (PLEG): container finished" podID="9e5f070e-cd41-4dd5-93ad-0e73b96d3858" containerID="7042ab590f3bb60447c517e6021fc7c3dcc55d6598bd4b95ba2734c012a75988" exitCode=0
Dec 01 10:33:53 crc kubenswrapper[4909]: I1201 10:33:53.003102 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hns5g" event={"ID":"9e5f070e-cd41-4dd5-93ad-0e73b96d3858","Type":"ContainerDied","Data":"7042ab590f3bb60447c517e6021fc7c3dcc55d6598bd4b95ba2734c012a75988"}
Dec 01 10:33:53 crc kubenswrapper[4909]: I1201 10:33:53.005779 4909 generic.go:334] "Generic (PLEG): container finished" podID="8949a76d-2e07-49ff-888a-0ca883b56996" containerID="60d56e63d7752827222b023745357166e9aaf6e00e594da26dd5962459acac97" exitCode=0
Dec 01 10:33:53 crc kubenswrapper[4909]: I1201 10:33:53.005839 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp5bb" event={"ID":"8949a76d-2e07-49ff-888a-0ca883b56996","Type":"ContainerDied","Data":"60d56e63d7752827222b023745357166e9aaf6e00e594da26dd5962459acac97"}
Dec 01 10:33:53 crc kubenswrapper[4909]: I1201 10:33:53.007424 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5q9m7" event={"ID":"948b91e0-ecf3-4837-8389-4777dae9d246","Type":"ContainerStarted","Data":"19e60ec182532e91b8d803915b723bd78022158fbaeff0c4a38897ec2523d0d1"}
Dec 01 10:33:53 crc kubenswrapper[4909]: I1201 10:33:53.011299 4909 generic.go:334] "Generic (PLEG): container finished" podID="d82be23d-d707-47c0-b3da-ad9c1dc7d304" containerID="d7a68e30fe87fc4fec50bbbf91dfe4f502782863876239355d942cdf6c194de2" exitCode=0
Dec 01 10:33:53 crc kubenswrapper[4909]: I1201 10:33:53.011350 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d82be23d-d707-47c0-b3da-ad9c1dc7d304","Type":"ContainerDied","Data":"d7a68e30fe87fc4fec50bbbf91dfe4f502782863876239355d942cdf6c194de2"}
Dec 01 10:33:53 crc kubenswrapper[4909]: I1201 10:33:53.012559 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-865bh" event={"ID":"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65","Type":"ContainerStarted","Data":"230be9764671fab59f0269309a2f45293a973d49f740ec2f2c6384ae79b5e525"}
Dec 01 10:33:53 crc kubenswrapper[4909]: I1201 10:33:53.015106 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2d90f458-a780-449a-8760-c8311aa1204b","Type":"ContainerStarted","Data":"59eec55b1f987519ec016a4c5202a495bf1de62bd764fe064da6d8a38c0704ef"}
Dec 01 10:33:53 crc kubenswrapper[4909]: I1201 10:33:53.678601 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-2q24w"
Dec 01 10:33:53 crc kubenswrapper[4909]: I1201 10:33:53.683456 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-2q24w"
Dec 01 10:33:54 crc kubenswrapper[4909]: I1201 10:33:54.033217 4909 generic.go:334] "Generic (PLEG): container finished" podID="948b91e0-ecf3-4837-8389-4777dae9d246" containerID="97b0c352a4577d4cf098c65994afa513a05696c212a4ed902c6c536c7f04536c" exitCode=0
Dec 01 10:33:54 crc kubenswrapper[4909]: I1201 10:33:54.033331 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5q9m7" event={"ID":"948b91e0-ecf3-4837-8389-4777dae9d246","Type":"ContainerDied","Data":"97b0c352a4577d4cf098c65994afa513a05696c212a4ed902c6c536c7f04536c"}
Dec 01 10:33:54 crc kubenswrapper[4909]: I1201 10:33:54.035984 4909 generic.go:334] "Generic (PLEG): container finished" podID="6b6f8e71-e11d-47d5-a8a2-c42e455f3f65" containerID="ce53868867776b25aa665913ff7c7c2ed3819b541622234d73160d98b3a0f739" exitCode=0
Dec 01 10:33:54 crc kubenswrapper[4909]: I1201 10:33:54.036057 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-865bh" event={"ID":"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65","Type":"ContainerDied","Data":"ce53868867776b25aa665913ff7c7c2ed3819b541622234d73160d98b3a0f739"}
Dec 01 10:33:54 crc kubenswrapper[4909]: I1201 10:33:54.038460 4909 generic.go:334] "Generic (PLEG): container finished" podID="2d90f458-a780-449a-8760-c8311aa1204b" containerID="7b52dd01deda74bed509e81f5b65fdbcbf57cc80b8dccb95e91ef46e46aea9ae" exitCode=0
Dec 01 10:33:54 crc kubenswrapper[4909]: I1201 10:33:54.038519 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2d90f458-a780-449a-8760-c8311aa1204b","Type":"ContainerDied","Data":"7b52dd01deda74bed509e81f5b65fdbcbf57cc80b8dccb95e91ef46e46aea9ae"}
Dec 01 10:33:54 crc kubenswrapper[4909]: I1201 10:33:54.331742 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 01 10:33:54 crc kubenswrapper[4909]: I1201 10:33:54.472964 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d82be23d-d707-47c0-b3da-ad9c1dc7d304-kubelet-dir\") pod \"d82be23d-d707-47c0-b3da-ad9c1dc7d304\" (UID: \"d82be23d-d707-47c0-b3da-ad9c1dc7d304\") "
Dec 01 10:33:54 crc kubenswrapper[4909]: I1201 10:33:54.473032 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d82be23d-d707-47c0-b3da-ad9c1dc7d304-kube-api-access\") pod \"d82be23d-d707-47c0-b3da-ad9c1dc7d304\" (UID: \"d82be23d-d707-47c0-b3da-ad9c1dc7d304\") "
Dec 01 10:33:54 crc kubenswrapper[4909]: I1201 10:33:54.474023 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d82be23d-d707-47c0-b3da-ad9c1dc7d304-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d82be23d-d707-47c0-b3da-ad9c1dc7d304" (UID: "d82be23d-d707-47c0-b3da-ad9c1dc7d304"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 10:33:54 crc kubenswrapper[4909]: I1201 10:33:54.491817 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d82be23d-d707-47c0-b3da-ad9c1dc7d304-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d82be23d-d707-47c0-b3da-ad9c1dc7d304" (UID: "d82be23d-d707-47c0-b3da-ad9c1dc7d304"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:33:54 crc kubenswrapper[4909]: I1201 10:33:54.573919 4909 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d82be23d-d707-47c0-b3da-ad9c1dc7d304-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 01 10:33:54 crc kubenswrapper[4909]: I1201 10:33:54.574030 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d82be23d-d707-47c0-b3da-ad9c1dc7d304-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 01 10:33:55 crc kubenswrapper[4909]: I1201 10:33:55.093433 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 01 10:33:55 crc kubenswrapper[4909]: I1201 10:33:55.093660 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d82be23d-d707-47c0-b3da-ad9c1dc7d304","Type":"ContainerDied","Data":"bd5af652fca318cc10a39fac2effb802c67a990b24ef2866f9caeed4358f138a"}
Dec 01 10:33:55 crc kubenswrapper[4909]: I1201 10:33:55.094208 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd5af652fca318cc10a39fac2effb802c67a990b24ef2866f9caeed4358f138a"
Dec 01 10:33:55 crc kubenswrapper[4909]: I1201 10:33:55.462498 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 01 10:33:55 crc kubenswrapper[4909]: I1201 10:33:55.607042 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2d90f458-a780-449a-8760-c8311aa1204b-kubelet-dir\") pod \"2d90f458-a780-449a-8760-c8311aa1204b\" (UID: \"2d90f458-a780-449a-8760-c8311aa1204b\") "
Dec 01 10:33:55 crc kubenswrapper[4909]: I1201 10:33:55.607166 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d90f458-a780-449a-8760-c8311aa1204b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2d90f458-a780-449a-8760-c8311aa1204b" (UID: "2d90f458-a780-449a-8760-c8311aa1204b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 10:33:55 crc kubenswrapper[4909]: I1201 10:33:55.607183 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d90f458-a780-449a-8760-c8311aa1204b-kube-api-access\") pod \"2d90f458-a780-449a-8760-c8311aa1204b\" (UID: \"2d90f458-a780-449a-8760-c8311aa1204b\") "
Dec 01 10:33:55 crc kubenswrapper[4909]: I1201 10:33:55.607635 4909 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2d90f458-a780-449a-8760-c8311aa1204b-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 01 10:33:55 crc kubenswrapper[4909]: I1201 10:33:55.646656 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d90f458-a780-449a-8760-c8311aa1204b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2d90f458-a780-449a-8760-c8311aa1204b" (UID: "2d90f458-a780-449a-8760-c8311aa1204b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:33:55 crc kubenswrapper[4909]: I1201 10:33:55.708952 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d90f458-a780-449a-8760-c8311aa1204b-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 01 10:33:56 crc kubenswrapper[4909]: I1201 10:33:56.109365 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2d90f458-a780-449a-8760-c8311aa1204b","Type":"ContainerDied","Data":"59eec55b1f987519ec016a4c5202a495bf1de62bd764fe064da6d8a38c0704ef"}
Dec 01 10:33:56 crc kubenswrapper[4909]: I1201 10:33:56.109421 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59eec55b1f987519ec016a4c5202a495bf1de62bd764fe064da6d8a38c0704ef"
Dec 01 10:33:56 crc kubenswrapper[4909]: I1201 10:33:56.109455 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 01 10:33:57 crc kubenswrapper[4909]: I1201 10:33:57.044849 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-k5k4b"
Dec 01 10:33:58 crc kubenswrapper[4909]: I1201 10:33:58.938587 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-2n2sj"
Dec 01 10:34:01 crc kubenswrapper[4909]: I1201 10:34:01.869644 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-wfgm2"
Dec 01 10:34:01 crc kubenswrapper[4909]: I1201 10:34:01.873616 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-wfgm2"
Dec 01 10:34:05 crc kubenswrapper[4909]: I1201 10:34:05.392539 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs\") pod \"network-metrics-daemon-z48j9\" (UID: \"dca0394a-c980-4220-ab44-d2f55519cb1a\") " pod="openshift-multus/network-metrics-daemon-z48j9"
Dec 01 10:34:05 crc kubenswrapper[4909]: I1201 10:34:05.404469 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca0394a-c980-4220-ab44-d2f55519cb1a-metrics-certs\") pod \"network-metrics-daemon-z48j9\" (UID: \"dca0394a-c980-4220-ab44-d2f55519cb1a\") " pod="openshift-multus/network-metrics-daemon-z48j9"
Dec 01 10:34:05 crc kubenswrapper[4909]: I1201 10:34:05.480891 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z48j9"
Dec 01 10:34:06 crc kubenswrapper[4909]: I1201 10:34:06.194389 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 10:34:06 crc kubenswrapper[4909]: I1201 10:34:06.194463 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 10:34:08 crc kubenswrapper[4909]: I1201 10:34:08.911571 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9"
Dec 01 10:34:21 crc kubenswrapper[4909]: I1201 10:34:21.333612 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z48j9"]
Dec 01 10:34:21 crc kubenswrapper[4909]: I1201 10:34:21.473604 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z48j9" event={"ID":"dca0394a-c980-4220-ab44-d2f55519cb1a","Type":"ContainerStarted","Data":"7e05f6aac40a14d2a90a4b88300d1fefc28ab37384b4a459976a66515de4afad"}
Dec 01 10:34:21 crc kubenswrapper[4909]: I1201 10:34:21.477025 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2kbh7" event={"ID":"13706ba3-7e21-4e1f-ac26-65e9f781d809","Type":"ContainerStarted","Data":"e29a43058885ac2d3770345b75d1136be54485ef4c077e6926d4236e598df99c"}
Dec 01 10:34:21 crc kubenswrapper[4909]: I1201 10:34:21.479103 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zd7jf" event={"ID":"e3ef24d9-67de-4a08-82ff-092b66feb19f","Type":"ContainerStarted","Data":"27d2ce814c451a17954b7cfbdc625f2077a0aa655d4e45f089c39c9efb696cfa"}
Dec 01 10:34:21 crc kubenswrapper[4909]: I1201 10:34:21.482404 4909 generic.go:334] "Generic (PLEG): container finished" podID="8949a76d-2e07-49ff-888a-0ca883b56996" containerID="f4573ad3c94a9613a717226e7b597ef72b32432147e51f02a006281cd20fbd50" exitCode=0
Dec 01 10:34:21 crc kubenswrapper[4909]: I1201 10:34:21.482497 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp5bb" event={"ID":"8949a76d-2e07-49ff-888a-0ca883b56996","Type":"ContainerDied","Data":"f4573ad3c94a9613a717226e7b597ef72b32432147e51f02a006281cd20fbd50"}
Dec 01 10:34:21 crc kubenswrapper[4909]: I1201 10:34:21.488410 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5q9m7" event={"ID":"948b91e0-ecf3-4837-8389-4777dae9d246","Type":"ContainerStarted","Data":"59e237be32ad828e5870edd1316fefe38bd3c2ef0f1a454bfe2db72a3afb8b05"}
Dec 01 10:34:21 crc kubenswrapper[4909]: I1201 10:34:21.491490 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khzr6" event={"ID":"6bfb46e1-7822-4508-ac16-d71780bcfe30","Type":"ContainerStarted","Data":"c89565443eaea28670f3e32a2f034ed1abfc81f754d451c0324fd1a0faa6a195"}
Dec 01 10:34:21 crc kubenswrapper[4909]: I1201 10:34:21.504003 4909 generic.go:334] "Generic (PLEG): container finished" podID="9e5f070e-cd41-4dd5-93ad-0e73b96d3858" containerID="158f41184e5bacc64e1f2de0266de54d4d408ac19172aa3d5710369538e18a09" exitCode=0
Dec 01 10:34:21 crc kubenswrapper[4909]: I1201 10:34:21.504069 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hns5g" event={"ID":"9e5f070e-cd41-4dd5-93ad-0e73b96d3858","Type":"ContainerDied","Data":"158f41184e5bacc64e1f2de0266de54d4d408ac19172aa3d5710369538e18a09"}
Dec 01 10:34:21 crc kubenswrapper[4909]: I1201 10:34:21.521387 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-865bh" event={"ID":"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65","Type":"ContainerStarted","Data":"3c535fb969561ef6766ab6e6272e05ea6e7d4be37c33557cea159b6408cbccc3"}
Dec 01 10:34:21 crc kubenswrapper[4909]: I1201 10:34:21.538068 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrjdh" event={"ID":"87a9da3f-7bc4-42e3-ade7-b4728615a67b","Type":"ContainerStarted","Data":"aec14c1b01fb03dacfbc18d7de403c3967e48a21a5ea46b9ca748facc5d5dbdf"}
Dec 01 10:34:22 crc kubenswrapper[4909]: I1201 10:34:22.061568 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-67lbv"
Dec 01 10:34:22 crc kubenswrapper[4909]: I1201 10:34:22.545107 4909 generic.go:334] "Generic (PLEG): container finished" podID="e3ef24d9-67de-4a08-82ff-092b66feb19f" containerID="27d2ce814c451a17954b7cfbdc625f2077a0aa655d4e45f089c39c9efb696cfa" exitCode=0
Dec 01 10:34:22 crc kubenswrapper[4909]: I1201 10:34:22.545168 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zd7jf" event={"ID":"e3ef24d9-67de-4a08-82ff-092b66feb19f","Type":"ContainerDied","Data":"27d2ce814c451a17954b7cfbdc625f2077a0aa655d4e45f089c39c9efb696cfa"}
Dec 01 10:34:22 crc kubenswrapper[4909]: I1201 10:34:22.546974 4909 generic.go:334] "Generic (PLEG): container finished" podID="948b91e0-ecf3-4837-8389-4777dae9d246" containerID="59e237be32ad828e5870edd1316fefe38bd3c2ef0f1a454bfe2db72a3afb8b05" exitCode=0
Dec 01 10:34:22 crc kubenswrapper[4909]: I1201 10:34:22.547034 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5q9m7" event={"ID":"948b91e0-ecf3-4837-8389-4777dae9d246","Type":"ContainerDied","Data":"59e237be32ad828e5870edd1316fefe38bd3c2ef0f1a454bfe2db72a3afb8b05"}
Dec 01 10:34:22 crc kubenswrapper[4909]: I1201 10:34:22.550727 4909 generic.go:334] "Generic (PLEG): container finished" podID="6b6f8e71-e11d-47d5-a8a2-c42e455f3f65" containerID="3c535fb969561ef6766ab6e6272e05ea6e7d4be37c33557cea159b6408cbccc3" exitCode=0
Dec 01 10:34:22 crc kubenswrapper[4909]: I1201 10:34:22.550782 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-865bh" event={"ID":"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65","Type":"ContainerDied","Data":"3c535fb969561ef6766ab6e6272e05ea6e7d4be37c33557cea159b6408cbccc3"}
Dec 01 10:34:22 crc kubenswrapper[4909]: I1201 10:34:22.552557 4909 generic.go:334] "Generic (PLEG): container finished" podID="6bfb46e1-7822-4508-ac16-d71780bcfe30" containerID="c89565443eaea28670f3e32a2f034ed1abfc81f754d451c0324fd1a0faa6a195" exitCode=0
Dec 01 10:34:22 crc kubenswrapper[4909]: I1201 10:34:22.552625 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khzr6" event={"ID":"6bfb46e1-7822-4508-ac16-d71780bcfe30","Type":"ContainerDied","Data":"c89565443eaea28670f3e32a2f034ed1abfc81f754d451c0324fd1a0faa6a195"}
Dec 01 10:34:22 crc kubenswrapper[4909]: I1201 10:34:22.554265 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z48j9" event={"ID":"dca0394a-c980-4220-ab44-d2f55519cb1a","Type":"ContainerStarted","Data":"586575ed3cd29c216e1f9971741fb1c9522dd7034b93f63b1e251981a766c272"}
Dec 01 10:34:22 crc kubenswrapper[4909]: I1201 10:34:22.554306 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z48j9" event={"ID":"dca0394a-c980-4220-ab44-d2f55519cb1a","Type":"ContainerStarted","Data":"eb2ed243fef3e0f2eb5fbb6116fed1d1acb0a358c710542c77aba6550f141138"}
Dec 01 10:34:22 crc kubenswrapper[4909]: I1201 10:34:22.556275 4909 generic.go:334] "Generic (PLEG): container finished" podID="87a9da3f-7bc4-42e3-ade7-b4728615a67b" containerID="aec14c1b01fb03dacfbc18d7de403c3967e48a21a5ea46b9ca748facc5d5dbdf" exitCode=0
Dec 01 10:34:22 crc kubenswrapper[4909]: I1201 10:34:22.556305 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrjdh" event={"ID":"87a9da3f-7bc4-42e3-ade7-b4728615a67b","Type":"ContainerDied","Data":"aec14c1b01fb03dacfbc18d7de403c3967e48a21a5ea46b9ca748facc5d5dbdf"}
Dec 01 10:34:22 crc kubenswrapper[4909]: I1201 10:34:22.558068 4909 generic.go:334] "Generic (PLEG): container finished" podID="13706ba3-7e21-4e1f-ac26-65e9f781d809" containerID="e29a43058885ac2d3770345b75d1136be54485ef4c077e6926d4236e598df99c" exitCode=0
Dec 01 10:34:22 crc kubenswrapper[4909]: I1201 10:34:22.558092 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2kbh7" event={"ID":"13706ba3-7e21-4e1f-ac26-65e9f781d809","Type":"ContainerDied","Data":"e29a43058885ac2d3770345b75d1136be54485ef4c077e6926d4236e598df99c"}
Dec 01 10:34:22 crc kubenswrapper[4909]: I1201 10:34:22.681010 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-z48j9" podStartSLOduration=160.68099049 podStartE2EDuration="2m40.68099049s" podCreationTimestamp="2025-12-01 10:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:34:22.663075726 +0000 UTC m=+179.897546624" watchObservedRunningTime="2025-12-01 10:34:22.68099049 +0000 UTC m=+179.915461388"
Dec 01 10:34:24 crc kubenswrapper[4909]: I1201 10:34:24.577633 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrjdh" event={"ID":"87a9da3f-7bc4-42e3-ade7-b4728615a67b","Type":"ContainerStarted","Data":"1af2589a318e95561f70157a2d3a5202516a4ad8710d6a145c3be18d692ed614"}
Dec 01 10:34:24 crc kubenswrapper[4909]: I1201 10:34:24.584674 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hns5g" event={"ID":"9e5f070e-cd41-4dd5-93ad-0e73b96d3858","Type":"ContainerStarted","Data":"a226a4ee004346ecbcd004a6778be4a1a556fb9bfab3dbc1d3219dde0e1b2875"}
Dec 01 10:34:24 crc kubenswrapper[4909]: I1201 10:34:24.597038 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zd7jf" event={"ID":"e3ef24d9-67de-4a08-82ff-092b66feb19f","Type":"ContainerStarted","Data":"03b459b1ec57f1590e7a78d837638d427b287f610647303cbd6bbed062126c35"}
Dec 01 10:34:24 crc kubenswrapper[4909]: I1201 10:34:24.599642 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp5bb" event={"ID":"8949a76d-2e07-49ff-888a-0ca883b56996","Type":"ContainerStarted","Data":"6f24187acb3ed4986040c49c45a832a6bd65adcdccd83b4f759d4741cf8a7d3f"}
Dec 01 10:34:24 crc kubenswrapper[4909]: I1201 10:34:24.601414 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vrjdh" podStartSLOduration=2.391385507 podStartE2EDuration="36.601393588s" podCreationTimestamp="2025-12-01 10:33:48 +0000 UTC" firstStartedPulling="2025-12-01 10:33:49.943213557 +0000 UTC m=+147.177684445" lastFinishedPulling="2025-12-01 10:34:24.153221618 +0000 UTC m=+181.387692526" observedRunningTime="2025-12-01 10:34:24.599354195 +0000 UTC m=+181.833825123" watchObservedRunningTime="2025-12-01 10:34:24.601393588 +0000 UTC m=+181.835864486"
Dec 01 10:34:24 crc kubenswrapper[4909]: I1201 10:34:24.604678 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5q9m7" event={"ID":"948b91e0-ecf3-4837-8389-4777dae9d246","Type":"ContainerStarted","Data":"e3b2c44c11b8705e591d3aea0b2f6e4531d96d8017fda3c3df517eca8539aaae"}
Dec 01 10:34:24 crc kubenswrapper[4909]: I1201 10:34:24.607059 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-865bh" event={"ID":"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65","Type":"ContainerStarted","Data":"867813fd28542dc7c779d6286773dac2e3969e3599c65f731ce4548c2f64b20c"}
Dec 01 10:34:24 crc kubenswrapper[4909]: I1201 10:34:24.622966 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zd7jf" podStartSLOduration=2.246712593 podStartE2EDuration="35.622948232s" podCreationTimestamp="2025-12-01 10:33:49 +0000 UTC" firstStartedPulling="2025-12-01 10:33:50.971916731 +0000 UTC m=+148.206387629" lastFinishedPulling="2025-12-01 10:34:24.34815237 +0000 UTC m=+181.582623268" observedRunningTime="2025-12-01 10:34:24.619422756 +0000 UTC m=+181.853893654" watchObservedRunningTime="2025-12-01 10:34:24.622948232 +0000 UTC m=+181.857419130"
Dec 01 10:34:24 crc kubenswrapper[4909]: I1201 10:34:24.641268 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hns5g" podStartSLOduration=2.700257143 podStartE2EDuration="33.6412508s" podCreationTimestamp="2025-12-01 10:33:51 +0000 UTC" firstStartedPulling="2025-12-01 10:33:53.005464713 +0000 UTC m=+150.239935611" lastFinishedPulling="2025-12-01 10:34:23.94645837 +0000 UTC m=+181.180929268" observedRunningTime="2025-12-01 10:34:24.638102126 +0000 UTC m=+181.872573034" watchObservedRunningTime="2025-12-01 10:34:24.6412508 +0000 UTC m=+181.875721698"
Dec 01 10:34:24 crc kubenswrapper[4909]: I1201 10:34:24.689453 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5q9m7" podStartSLOduration=2.524228232 podStartE2EDuration="32.68942517s" podCreationTimestamp="2025-12-01 10:33:52 +0000 UTC" firstStartedPulling="2025-12-01 10:33:54.035028708 +0000 UTC m=+151.269499606" lastFinishedPulling="2025-12-01 10:34:24.200225646 +0000 UTC m=+181.434696544" observedRunningTime="2025-12-01 10:34:24.688169305 +0000 UTC m=+181.922640203" watchObservedRunningTime="2025-12-01 10:34:24.68942517 +0000 UTC m=+181.923896078"
Dec 01 10:34:24 crc kubenswrapper[4909]: I1201 10:34:24.693481 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bp5bb" podStartSLOduration=4.472546834 podStartE2EDuration="34.693467125s" podCreationTimestamp="2025-12-01 10:33:50 +0000 UTC" firstStartedPulling="2025-12-01 10:33:53.007176904 +0000 UTC m=+150.241647802" lastFinishedPulling="2025-12-01 10:34:23.228097195 +0000 UTC m=+180.462568093" observedRunningTime="2025-12-01 10:34:24.662564935 +0000 UTC m=+181.897035843" watchObservedRunningTime="2025-12-01 10:34:24.693467125 +0000 UTC m=+181.927938033"
Dec 01 10:34:24 crc kubenswrapper[4909]: I1201 10:34:24.722328 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-865bh" podStartSLOduration=3.549134858 podStartE2EDuration="33.722298461s" podCreationTimestamp="2025-12-01 10:33:51 +0000 UTC" firstStartedPulling="2025-12-01 10:33:54.036826414 +0000 UTC m=+151.271297312" lastFinishedPulling="2025-12-01 10:34:24.209990017 +0000 UTC m=+181.444460915" observedRunningTime="2025-12-01 10:34:24.717853861 +0000 UTC m=+181.952324779" watchObservedRunningTime="2025-12-01 10:34:24.722298461 +0000 UTC m=+181.956769359"
Dec 01 10:34:25 crc kubenswrapper[4909]: I1201 10:34:25.627896 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2kbh7" event={"ID":"13706ba3-7e21-4e1f-ac26-65e9f781d809","Type":"ContainerStarted","Data":"dfb49a5e2014ae6d25046c0c3cdcd5f4014df68e91cef2db9685ffd56840145a"}
Dec 01 10:34:25 crc kubenswrapper[4909]: I1201 10:34:25.630452 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khzr6" event={"ID":"6bfb46e1-7822-4508-ac16-d71780bcfe30","Type":"ContainerStarted","Data":"5da1dfe11ad709a69b6a7b3fefa864235e1dbc0dc2b3d3e85152861d19284ff0"}
Dec 01 10:34:25 crc kubenswrapper[4909]: I1201 10:34:25.652168 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2kbh7" podStartSLOduration=2.903312657 podStartE2EDuration="37.652141174s" podCreationTimestamp="2025-12-01 10:33:48 +0000 UTC" firstStartedPulling="2025-12-01 10:33:49.942818273 +0000 UTC m=+147.177289171" lastFinishedPulling="2025-12-01 10:34:24.69164679 +0000 UTC m=+181.926117688" observedRunningTime="2025-12-01 10:34:25.65201446 +0000 UTC m=+182.886485368" watchObservedRunningTime="2025-12-01 10:34:25.652141174 +0000 UTC m=+182.886612072"
Dec 01 10:34:25 crc kubenswrapper[4909]: I1201 10:34:25.677198 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-khzr6" podStartSLOduration=3.182663033 podStartE2EDuration="37.677176034s" podCreationTimestamp="2025-12-01 10:33:48 +0000 UTC" firstStartedPulling="2025-12-01 10:33:49.935981237 +0000 UTC m=+147.170452135" lastFinishedPulling="2025-12-01 10:34:24.430494248 +0000 UTC m=+181.664965136" observedRunningTime="2025-12-01 10:34:25.674699265 +0000 UTC m=+182.909170183" watchObservedRunningTime="2025-12-01 10:34:25.677176034 +0000 UTC m=+182.911646942"
Dec 01 10:34:26 crc kubenswrapper[4909]: I1201 10:34:26.411806 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 01 10:34:26 crc kubenswrapper[4909]: E1201 10:34:26.412140 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d90f458-a780-449a-8760-c8311aa1204b" containerName="pruner"
Dec 01 10:34:26 crc kubenswrapper[4909]: I1201 10:34:26.412156 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d90f458-a780-449a-8760-c8311aa1204b" containerName="pruner"
Dec 01 10:34:26 crc kubenswrapper[4909]: E1201 10:34:26.412176 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d82be23d-d707-47c0-b3da-ad9c1dc7d304" containerName="pruner"
Dec 01 10:34:26 crc kubenswrapper[4909]: I1201 10:34:26.412185 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82be23d-d707-47c0-b3da-ad9c1dc7d304" containerName="pruner"
Dec 01 10:34:26 crc kubenswrapper[4909]: I1201 10:34:26.412294 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d90f458-a780-449a-8760-c8311aa1204b" containerName="pruner"
Dec 01 10:34:26 crc kubenswrapper[4909]: I1201 10:34:26.412311 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d82be23d-d707-47c0-b3da-ad9c1dc7d304" containerName="pruner"
Dec 01 10:34:26 crc kubenswrapper[4909]: I1201 10:34:26.412772 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 01 10:34:26 crc kubenswrapper[4909]: I1201 10:34:26.416153 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Dec 01 10:34:26 crc kubenswrapper[4909]: I1201 10:34:26.425113 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 01 10:34:26 crc kubenswrapper[4909]: I1201 10:34:26.426173 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Dec 01 10:34:26 crc kubenswrapper[4909]: I1201 10:34:26.504491 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e0947c4-09ce-48d5-a441-67da0750c76e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8e0947c4-09ce-48d5-a441-67da0750c76e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 01 10:34:26 crc kubenswrapper[4909]: I1201 10:34:26.504570 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e0947c4-09ce-48d5-a441-67da0750c76e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8e0947c4-09ce-48d5-a441-67da0750c76e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 01 10:34:26 crc kubenswrapper[4909]: I1201 10:34:26.606303 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e0947c4-09ce-48d5-a441-67da0750c76e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8e0947c4-09ce-48d5-a441-67da0750c76e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 01 10:34:26 crc kubenswrapper[4909]: I1201 10:34:26.606378 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e0947c4-09ce-48d5-a441-67da0750c76e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8e0947c4-09ce-48d5-a441-67da0750c76e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 01 10:34:26 crc kubenswrapper[4909]: I1201 10:34:26.606513 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e0947c4-09ce-48d5-a441-67da0750c76e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8e0947c4-09ce-48d5-a441-67da0750c76e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 01 10:34:26 crc kubenswrapper[4909]: I1201 10:34:26.627345 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e0947c4-09ce-48d5-a441-67da0750c76e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8e0947c4-09ce-48d5-a441-67da0750c76e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 01 10:34:26 crc kubenswrapper[4909]: I1201 10:34:26.730957 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 01 10:34:27 crc kubenswrapper[4909]: I1201 10:34:27.165814 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 01 10:34:27 crc kubenswrapper[4909]: I1201 10:34:27.643419 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8e0947c4-09ce-48d5-a441-67da0750c76e","Type":"ContainerStarted","Data":"e9e8ad711d610799af1b23c129bef2f1e93183a72cc2da8a86cc9c73f04b201c"}
Dec 01 10:34:28 crc kubenswrapper[4909]: I1201 10:34:28.657017 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8e0947c4-09ce-48d5-a441-67da0750c76e","Type":"ContainerStarted","Data":"7cc31223901457c86e8c2fa08b7c32741f1f312260737e0a47e8343725bf4cc2"}
Dec 01 10:34:28 crc kubenswrapper[4909]: I1201 10:34:28.675131 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.675110699 podStartE2EDuration="2.675110699s" podCreationTimestamp="2025-12-01 10:34:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:34:28.675072138 +0000 UTC m=+185.909543056" watchObservedRunningTime="2025-12-01 10:34:28.675110699 +0000 UTC m=+185.909581607"
Dec 01 10:34:28 crc kubenswrapper[4909]: I1201 10:34:28.884644 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vrjdh"
Dec 01 10:34:28 crc kubenswrapper[4909]: I1201 10:34:28.884742 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vrjdh"
Dec 01 10:34:29 crc kubenswrapper[4909]: I1201 10:34:29.053077 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2kbh7"
Dec 01 10:34:29 crc kubenswrapper[4909]: I1201 10:34:29.053722 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2kbh7"
Dec 01 10:34:29 crc kubenswrapper[4909]: I1201 10:34:29.273730 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-khzr6"
Dec 01 10:34:29 crc kubenswrapper[4909]: I1201 10:34:29.273770 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-khzr6"
Dec 01 10:34:29 crc kubenswrapper[4909]: I1201 10:34:29.470782 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zd7jf"
Dec 01 10:34:29 crc kubenswrapper[4909]: I1201 10:34:29.470889 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zd7jf"
Dec 01 10:34:29 crc kubenswrapper[4909]: I1201 10:34:29.738444 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zd7jf"
Dec 01 10:34:29 crc kubenswrapper[4909]: I1201 10:34:29.742070 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2kbh7"
Dec 01 10:34:29 crc kubenswrapper[4909]: I1201 10:34:29.750139 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-khzr6"
Dec 01 10:34:29 crc kubenswrapper[4909]: I1201 10:34:29.751074 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vrjdh"
Dec 01 10:34:29 crc kubenswrapper[4909]: I1201 10:34:29.789406 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zd7jf"
Dec 01 10:34:29 crc kubenswrapper[4909]: I1201
10:34:29.827044 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vrjdh" Dec 01 10:34:29 crc kubenswrapper[4909]: I1201 10:34:29.844280 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hpz48"] Dec 01 10:34:29 crc kubenswrapper[4909]: I1201 10:34:29.863416 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-khzr6" Dec 01 10:34:29 crc kubenswrapper[4909]: I1201 10:34:29.926082 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:34:30 crc kubenswrapper[4909]: I1201 10:34:30.668115 4909 generic.go:334] "Generic (PLEG): container finished" podID="8e0947c4-09ce-48d5-a441-67da0750c76e" containerID="7cc31223901457c86e8c2fa08b7c32741f1f312260737e0a47e8343725bf4cc2" exitCode=0 Dec 01 10:34:30 crc kubenswrapper[4909]: I1201 10:34:30.668232 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8e0947c4-09ce-48d5-a441-67da0750c76e","Type":"ContainerDied","Data":"7cc31223901457c86e8c2fa08b7c32741f1f312260737e0a47e8343725bf4cc2"} Dec 01 10:34:30 crc kubenswrapper[4909]: I1201 10:34:30.730525 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2kbh7" Dec 01 10:34:31 crc kubenswrapper[4909]: I1201 10:34:31.112452 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bp5bb" Dec 01 10:34:31 crc kubenswrapper[4909]: I1201 10:34:31.113715 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bp5bb" Dec 01 10:34:31 crc kubenswrapper[4909]: I1201 10:34:31.151757 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-bp5bb" Dec 01 10:34:31 crc kubenswrapper[4909]: I1201 10:34:31.505845 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hns5g" Dec 01 10:34:31 crc kubenswrapper[4909]: I1201 10:34:31.505919 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hns5g" Dec 01 10:34:31 crc kubenswrapper[4909]: I1201 10:34:31.543545 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hns5g" Dec 01 10:34:31 crc kubenswrapper[4909]: I1201 10:34:31.720201 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bp5bb" Dec 01 10:34:31 crc kubenswrapper[4909]: I1201 10:34:31.721120 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hns5g" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.065103 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.095587 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e0947c4-09ce-48d5-a441-67da0750c76e-kube-api-access\") pod \"8e0947c4-09ce-48d5-a441-67da0750c76e\" (UID: \"8e0947c4-09ce-48d5-a441-67da0750c76e\") " Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.095693 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e0947c4-09ce-48d5-a441-67da0750c76e-kubelet-dir\") pod \"8e0947c4-09ce-48d5-a441-67da0750c76e\" (UID: \"8e0947c4-09ce-48d5-a441-67da0750c76e\") " Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.095840 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e0947c4-09ce-48d5-a441-67da0750c76e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8e0947c4-09ce-48d5-a441-67da0750c76e" (UID: "8e0947c4-09ce-48d5-a441-67da0750c76e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.096253 4909 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e0947c4-09ce-48d5-a441-67da0750c76e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.107114 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e0947c4-09ce-48d5-a441-67da0750c76e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8e0947c4-09ce-48d5-a441-67da0750c76e" (UID: "8e0947c4-09ce-48d5-a441-67da0750c76e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.129651 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-865bh" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.129697 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-865bh" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.134004 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zd7jf"] Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.134248 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zd7jf" podUID="e3ef24d9-67de-4a08-82ff-092b66feb19f" containerName="registry-server" containerID="cri-o://03b459b1ec57f1590e7a78d837638d427b287f610647303cbd6bbed062126c35" gracePeriod=2 Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.170900 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-865bh" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.197086 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e0947c4-09ce-48d5-a441-67da0750c76e-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.207930 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 10:34:32 crc kubenswrapper[4909]: E1201 10:34:32.208219 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e0947c4-09ce-48d5-a441-67da0750c76e" containerName="pruner" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.208234 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e0947c4-09ce-48d5-a441-67da0750c76e" containerName="pruner" Dec 01 10:34:32 crc 
kubenswrapper[4909]: I1201 10:34:32.208368 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e0947c4-09ce-48d5-a441-67da0750c76e" containerName="pruner" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.208821 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.224362 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.298979 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0b055ce-6813-4bda-80c6-e71788d05982-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e0b055ce-6813-4bda-80c6-e71788d05982\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.299137 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e0b055ce-6813-4bda-80c6-e71788d05982-var-lock\") pod \"installer-9-crc\" (UID: \"e0b055ce-6813-4bda-80c6-e71788d05982\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.299170 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0b055ce-6813-4bda-80c6-e71788d05982-kube-api-access\") pod \"installer-9-crc\" (UID: \"e0b055ce-6813-4bda-80c6-e71788d05982\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.331129 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5q9m7" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.331176 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5q9m7" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.373132 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5q9m7" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.400461 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0b055ce-6813-4bda-80c6-e71788d05982-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e0b055ce-6813-4bda-80c6-e71788d05982\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.400595 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0b055ce-6813-4bda-80c6-e71788d05982-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e0b055ce-6813-4bda-80c6-e71788d05982\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.401009 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e0b055ce-6813-4bda-80c6-e71788d05982-var-lock\") pod \"installer-9-crc\" (UID: \"e0b055ce-6813-4bda-80c6-e71788d05982\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.401530 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0b055ce-6813-4bda-80c6-e71788d05982-kube-api-access\") pod \"installer-9-crc\" (UID: \"e0b055ce-6813-4bda-80c6-e71788d05982\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.401130 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e0b055ce-6813-4bda-80c6-e71788d05982-var-lock\") pod 
\"installer-9-crc\" (UID: \"e0b055ce-6813-4bda-80c6-e71788d05982\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.679461 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.679455 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8e0947c4-09ce-48d5-a441-67da0750c76e","Type":"ContainerDied","Data":"e9e8ad711d610799af1b23c129bef2f1e93183a72cc2da8a86cc9c73f04b201c"} Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.679530 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9e8ad711d610799af1b23c129bef2f1e93183a72cc2da8a86cc9c73f04b201c" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.717045 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-865bh" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.717590 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5q9m7" Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.734396 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-khzr6"] Dec 01 10:34:32 crc kubenswrapper[4909]: I1201 10:34:32.734653 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-khzr6" podUID="6bfb46e1-7822-4508-ac16-d71780bcfe30" containerName="registry-server" containerID="cri-o://5da1dfe11ad709a69b6a7b3fefa864235e1dbc0dc2b3d3e85152861d19284ff0" gracePeriod=2 Dec 01 10:34:33 crc kubenswrapper[4909]: I1201 10:34:33.950670 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e0b055ce-6813-4bda-80c6-e71788d05982-kube-api-access\") pod \"installer-9-crc\" (UID: \"e0b055ce-6813-4bda-80c6-e71788d05982\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.028330 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:34:34 crc kubenswrapper[4909]: W1201 10:34:34.451678 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode0b055ce_6813_4bda_80c6_e71788d05982.slice/crio-7630fd4ccc0fe9e78bcfe5048d6ea7f949682a22968798ce985db65a5a9d760d WatchSource:0}: Error finding container 7630fd4ccc0fe9e78bcfe5048d6ea7f949682a22968798ce985db65a5a9d760d: Status 404 returned error can't find the container with id 7630fd4ccc0fe9e78bcfe5048d6ea7f949682a22968798ce985db65a5a9d760d Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.453689 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.458804 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zd7jf" Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.535596 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3ef24d9-67de-4a08-82ff-092b66feb19f-catalog-content\") pod \"e3ef24d9-67de-4a08-82ff-092b66feb19f\" (UID: \"e3ef24d9-67de-4a08-82ff-092b66feb19f\") " Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.536063 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2lxt\" (UniqueName: \"kubernetes.io/projected/e3ef24d9-67de-4a08-82ff-092b66feb19f-kube-api-access-k2lxt\") pod \"e3ef24d9-67de-4a08-82ff-092b66feb19f\" (UID: \"e3ef24d9-67de-4a08-82ff-092b66feb19f\") " Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.536281 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3ef24d9-67de-4a08-82ff-092b66feb19f-utilities\") pod \"e3ef24d9-67de-4a08-82ff-092b66feb19f\" (UID: \"e3ef24d9-67de-4a08-82ff-092b66feb19f\") " Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.536076 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hns5g"] Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.536942 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hns5g" podUID="9e5f070e-cd41-4dd5-93ad-0e73b96d3858" containerName="registry-server" containerID="cri-o://a226a4ee004346ecbcd004a6778be4a1a556fb9bfab3dbc1d3219dde0e1b2875" gracePeriod=2 Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.537790 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3ef24d9-67de-4a08-82ff-092b66feb19f-utilities" (OuterVolumeSpecName: "utilities") pod "e3ef24d9-67de-4a08-82ff-092b66feb19f" (UID: 
"e3ef24d9-67de-4a08-82ff-092b66feb19f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.541621 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3ef24d9-67de-4a08-82ff-092b66feb19f-kube-api-access-k2lxt" (OuterVolumeSpecName: "kube-api-access-k2lxt") pod "e3ef24d9-67de-4a08-82ff-092b66feb19f" (UID: "e3ef24d9-67de-4a08-82ff-092b66feb19f"). InnerVolumeSpecName "kube-api-access-k2lxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.605943 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3ef24d9-67de-4a08-82ff-092b66feb19f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3ef24d9-67de-4a08-82ff-092b66feb19f" (UID: "e3ef24d9-67de-4a08-82ff-092b66feb19f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.639306 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3ef24d9-67de-4a08-82ff-092b66feb19f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.639378 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2lxt\" (UniqueName: \"kubernetes.io/projected/e3ef24d9-67de-4a08-82ff-092b66feb19f-kube-api-access-k2lxt\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.639398 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3ef24d9-67de-4a08-82ff-092b66feb19f-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.694338 4909 generic.go:334] "Generic (PLEG): container finished" 
podID="6bfb46e1-7822-4508-ac16-d71780bcfe30" containerID="5da1dfe11ad709a69b6a7b3fefa864235e1dbc0dc2b3d3e85152861d19284ff0" exitCode=0 Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.694394 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khzr6" event={"ID":"6bfb46e1-7822-4508-ac16-d71780bcfe30","Type":"ContainerDied","Data":"5da1dfe11ad709a69b6a7b3fefa864235e1dbc0dc2b3d3e85152861d19284ff0"} Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.695617 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e0b055ce-6813-4bda-80c6-e71788d05982","Type":"ContainerStarted","Data":"7630fd4ccc0fe9e78bcfe5048d6ea7f949682a22968798ce985db65a5a9d760d"} Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.699376 4909 generic.go:334] "Generic (PLEG): container finished" podID="e3ef24d9-67de-4a08-82ff-092b66feb19f" containerID="03b459b1ec57f1590e7a78d837638d427b287f610647303cbd6bbed062126c35" exitCode=0 Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.699461 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zd7jf" event={"ID":"e3ef24d9-67de-4a08-82ff-092b66feb19f","Type":"ContainerDied","Data":"03b459b1ec57f1590e7a78d837638d427b287f610647303cbd6bbed062126c35"} Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.699504 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zd7jf" Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.699532 4909 scope.go:117] "RemoveContainer" containerID="03b459b1ec57f1590e7a78d837638d427b287f610647303cbd6bbed062126c35" Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.699514 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zd7jf" event={"ID":"e3ef24d9-67de-4a08-82ff-092b66feb19f","Type":"ContainerDied","Data":"d745c389827523892ebd936ccc09275ba6a213ca72e14384266934febecaf93d"} Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.731765 4909 scope.go:117] "RemoveContainer" containerID="27d2ce814c451a17954b7cfbdc625f2077a0aa655d4e45f089c39c9efb696cfa" Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.733250 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zd7jf"] Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.736497 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zd7jf"] Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.831490 4909 scope.go:117] "RemoveContainer" containerID="c053df2dff072ecba718c484ee1d2662ef042f1e1bc4795724358e26f214f5c9" Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.847522 4909 scope.go:117] "RemoveContainer" containerID="03b459b1ec57f1590e7a78d837638d427b287f610647303cbd6bbed062126c35" Dec 01 10:34:34 crc kubenswrapper[4909]: E1201 10:34:34.848095 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03b459b1ec57f1590e7a78d837638d427b287f610647303cbd6bbed062126c35\": container with ID starting with 03b459b1ec57f1590e7a78d837638d427b287f610647303cbd6bbed062126c35 not found: ID does not exist" containerID="03b459b1ec57f1590e7a78d837638d427b287f610647303cbd6bbed062126c35" Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.848146 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03b459b1ec57f1590e7a78d837638d427b287f610647303cbd6bbed062126c35"} err="failed to get container status \"03b459b1ec57f1590e7a78d837638d427b287f610647303cbd6bbed062126c35\": rpc error: code = NotFound desc = could not find container \"03b459b1ec57f1590e7a78d837638d427b287f610647303cbd6bbed062126c35\": container with ID starting with 03b459b1ec57f1590e7a78d837638d427b287f610647303cbd6bbed062126c35 not found: ID does not exist" Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.848221 4909 scope.go:117] "RemoveContainer" containerID="27d2ce814c451a17954b7cfbdc625f2077a0aa655d4e45f089c39c9efb696cfa" Dec 01 10:34:34 crc kubenswrapper[4909]: E1201 10:34:34.848696 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27d2ce814c451a17954b7cfbdc625f2077a0aa655d4e45f089c39c9efb696cfa\": container with ID starting with 27d2ce814c451a17954b7cfbdc625f2077a0aa655d4e45f089c39c9efb696cfa not found: ID does not exist" containerID="27d2ce814c451a17954b7cfbdc625f2077a0aa655d4e45f089c39c9efb696cfa" Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.848795 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27d2ce814c451a17954b7cfbdc625f2077a0aa655d4e45f089c39c9efb696cfa"} err="failed to get container status \"27d2ce814c451a17954b7cfbdc625f2077a0aa655d4e45f089c39c9efb696cfa\": rpc error: code = NotFound desc = could not find container \"27d2ce814c451a17954b7cfbdc625f2077a0aa655d4e45f089c39c9efb696cfa\": container with ID starting with 27d2ce814c451a17954b7cfbdc625f2077a0aa655d4e45f089c39c9efb696cfa not found: ID does not exist" Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.848896 4909 scope.go:117] "RemoveContainer" containerID="c053df2dff072ecba718c484ee1d2662ef042f1e1bc4795724358e26f214f5c9" Dec 01 10:34:34 crc kubenswrapper[4909]: E1201 
10:34:34.849290 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c053df2dff072ecba718c484ee1d2662ef042f1e1bc4795724358e26f214f5c9\": container with ID starting with c053df2dff072ecba718c484ee1d2662ef042f1e1bc4795724358e26f214f5c9 not found: ID does not exist" containerID="c053df2dff072ecba718c484ee1d2662ef042f1e1bc4795724358e26f214f5c9" Dec 01 10:34:34 crc kubenswrapper[4909]: I1201 10:34:34.849376 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c053df2dff072ecba718c484ee1d2662ef042f1e1bc4795724358e26f214f5c9"} err="failed to get container status \"c053df2dff072ecba718c484ee1d2662ef042f1e1bc4795724358e26f214f5c9\": rpc error: code = NotFound desc = could not find container \"c053df2dff072ecba718c484ee1d2662ef042f1e1bc4795724358e26f214f5c9\": container with ID starting with c053df2dff072ecba718c484ee1d2662ef042f1e1bc4795724358e26f214f5c9 not found: ID does not exist" Dec 01 10:34:35 crc kubenswrapper[4909]: I1201 10:34:35.266671 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3ef24d9-67de-4a08-82ff-092b66feb19f" path="/var/lib/kubelet/pods/e3ef24d9-67de-4a08-82ff-092b66feb19f/volumes" Dec 01 10:34:35 crc kubenswrapper[4909]: I1201 10:34:35.704953 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e0b055ce-6813-4bda-80c6-e71788d05982","Type":"ContainerStarted","Data":"73bb607e04773da09307439428d10e438844a5832253823382b19894c3b47bb2"} Dec 01 10:34:35 crc kubenswrapper[4909]: I1201 10:34:35.706669 4909 generic.go:334] "Generic (PLEG): container finished" podID="9e5f070e-cd41-4dd5-93ad-0e73b96d3858" containerID="a226a4ee004346ecbcd004a6778be4a1a556fb9bfab3dbc1d3219dde0e1b2875" exitCode=0 Dec 01 10:34:35 crc kubenswrapper[4909]: I1201 10:34:35.706714 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-hns5g" event={"ID":"9e5f070e-cd41-4dd5-93ad-0e73b96d3858","Type":"ContainerDied","Data":"a226a4ee004346ecbcd004a6778be4a1a556fb9bfab3dbc1d3219dde0e1b2875"} Dec 01 10:34:35 crc kubenswrapper[4909]: I1201 10:34:35.906434 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-khzr6" Dec 01 10:34:35 crc kubenswrapper[4909]: I1201 10:34:35.961820 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bfb46e1-7822-4508-ac16-d71780bcfe30-catalog-content\") pod \"6bfb46e1-7822-4508-ac16-d71780bcfe30\" (UID: \"6bfb46e1-7822-4508-ac16-d71780bcfe30\") " Dec 01 10:34:35 crc kubenswrapper[4909]: I1201 10:34:35.962358 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxn5t\" (UniqueName: \"kubernetes.io/projected/6bfb46e1-7822-4508-ac16-d71780bcfe30-kube-api-access-rxn5t\") pod \"6bfb46e1-7822-4508-ac16-d71780bcfe30\" (UID: \"6bfb46e1-7822-4508-ac16-d71780bcfe30\") " Dec 01 10:34:35 crc kubenswrapper[4909]: I1201 10:34:35.962410 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bfb46e1-7822-4508-ac16-d71780bcfe30-utilities\") pod \"6bfb46e1-7822-4508-ac16-d71780bcfe30\" (UID: \"6bfb46e1-7822-4508-ac16-d71780bcfe30\") " Dec 01 10:34:35 crc kubenswrapper[4909]: I1201 10:34:35.963289 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bfb46e1-7822-4508-ac16-d71780bcfe30-utilities" (OuterVolumeSpecName: "utilities") pod "6bfb46e1-7822-4508-ac16-d71780bcfe30" (UID: "6bfb46e1-7822-4508-ac16-d71780bcfe30"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:35 crc kubenswrapper[4909]: I1201 10:34:35.970933 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bfb46e1-7822-4508-ac16-d71780bcfe30-kube-api-access-rxn5t" (OuterVolumeSpecName: "kube-api-access-rxn5t") pod "6bfb46e1-7822-4508-ac16-d71780bcfe30" (UID: "6bfb46e1-7822-4508-ac16-d71780bcfe30"). InnerVolumeSpecName "kube-api-access-rxn5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.013854 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bfb46e1-7822-4508-ac16-d71780bcfe30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6bfb46e1-7822-4508-ac16-d71780bcfe30" (UID: "6bfb46e1-7822-4508-ac16-d71780bcfe30"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.065863 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxn5t\" (UniqueName: \"kubernetes.io/projected/6bfb46e1-7822-4508-ac16-d71780bcfe30-kube-api-access-rxn5t\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.065922 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bfb46e1-7822-4508-ac16-d71780bcfe30-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.065939 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bfb46e1-7822-4508-ac16-d71780bcfe30-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.194259 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.194324 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.234441 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hns5g" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.269053 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e5f070e-cd41-4dd5-93ad-0e73b96d3858-utilities\") pod \"9e5f070e-cd41-4dd5-93ad-0e73b96d3858\" (UID: \"9e5f070e-cd41-4dd5-93ad-0e73b96d3858\") " Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.269136 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e5f070e-cd41-4dd5-93ad-0e73b96d3858-catalog-content\") pod \"9e5f070e-cd41-4dd5-93ad-0e73b96d3858\" (UID: \"9e5f070e-cd41-4dd5-93ad-0e73b96d3858\") " Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.269191 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rd96\" (UniqueName: \"kubernetes.io/projected/9e5f070e-cd41-4dd5-93ad-0e73b96d3858-kube-api-access-6rd96\") pod \"9e5f070e-cd41-4dd5-93ad-0e73b96d3858\" (UID: \"9e5f070e-cd41-4dd5-93ad-0e73b96d3858\") " Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.270807 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9e5f070e-cd41-4dd5-93ad-0e73b96d3858-utilities" (OuterVolumeSpecName: "utilities") pod "9e5f070e-cd41-4dd5-93ad-0e73b96d3858" (UID: "9e5f070e-cd41-4dd5-93ad-0e73b96d3858"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.275345 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e5f070e-cd41-4dd5-93ad-0e73b96d3858-kube-api-access-6rd96" (OuterVolumeSpecName: "kube-api-access-6rd96") pod "9e5f070e-cd41-4dd5-93ad-0e73b96d3858" (UID: "9e5f070e-cd41-4dd5-93ad-0e73b96d3858"). InnerVolumeSpecName "kube-api-access-6rd96". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.289521 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e5f070e-cd41-4dd5-93ad-0e73b96d3858-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e5f070e-cd41-4dd5-93ad-0e73b96d3858" (UID: "9e5f070e-cd41-4dd5-93ad-0e73b96d3858"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.371356 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e5f070e-cd41-4dd5-93ad-0e73b96d3858-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.371705 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e5f070e-cd41-4dd5-93ad-0e73b96d3858-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.371805 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rd96\" (UniqueName: \"kubernetes.io/projected/9e5f070e-cd41-4dd5-93ad-0e73b96d3858-kube-api-access-6rd96\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.714230 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khzr6" event={"ID":"6bfb46e1-7822-4508-ac16-d71780bcfe30","Type":"ContainerDied","Data":"1546be78afba9f432d86884c0779beaf03db9a580538a4ee687aba933825a669"} Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.714283 4909 scope.go:117] "RemoveContainer" containerID="5da1dfe11ad709a69b6a7b3fefa864235e1dbc0dc2b3d3e85152861d19284ff0" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.714437 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-khzr6" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.718975 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hns5g" event={"ID":"9e5f070e-cd41-4dd5-93ad-0e73b96d3858","Type":"ContainerDied","Data":"fe28ca0da2d3b41845d4986a5e6fb8c510747490805498557f34f019f00827ac"} Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.719031 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hns5g" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.732967 4909 scope.go:117] "RemoveContainer" containerID="c89565443eaea28670f3e32a2f034ed1abfc81f754d451c0324fd1a0faa6a195" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.740607 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=4.740589549 podStartE2EDuration="4.740589549s" podCreationTimestamp="2025-12-01 10:34:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:34:36.738514155 +0000 UTC m=+193.972985073" watchObservedRunningTime="2025-12-01 10:34:36.740589549 +0000 UTC m=+193.975060447" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.755551 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-khzr6"] Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.764054 4909 scope.go:117] "RemoveContainer" containerID="b7ff9618fe72179db1f2993a630a9a3737f036d5f4cb17843fdd56975a1a2c26" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.769940 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-khzr6"] Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.772664 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-hns5g"] Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.776182 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hns5g"] Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.792342 4909 scope.go:117] "RemoveContainer" containerID="a226a4ee004346ecbcd004a6778be4a1a556fb9bfab3dbc1d3219dde0e1b2875" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.812214 4909 scope.go:117] "RemoveContainer" containerID="158f41184e5bacc64e1f2de0266de54d4d408ac19172aa3d5710369538e18a09" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.829327 4909 scope.go:117] "RemoveContainer" containerID="7042ab590f3bb60447c517e6021fc7c3dcc55d6598bd4b95ba2734c012a75988" Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.931819 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5q9m7"] Dec 01 10:34:36 crc kubenswrapper[4909]: I1201 10:34:36.932106 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5q9m7" podUID="948b91e0-ecf3-4837-8389-4777dae9d246" containerName="registry-server" containerID="cri-o://e3b2c44c11b8705e591d3aea0b2f6e4531d96d8017fda3c3df517eca8539aaae" gracePeriod=2 Dec 01 10:34:37 crc kubenswrapper[4909]: I1201 10:34:37.264279 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bfb46e1-7822-4508-ac16-d71780bcfe30" path="/var/lib/kubelet/pods/6bfb46e1-7822-4508-ac16-d71780bcfe30/volumes" Dec 01 10:34:37 crc kubenswrapper[4909]: I1201 10:34:37.265232 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e5f070e-cd41-4dd5-93ad-0e73b96d3858" path="/var/lib/kubelet/pods/9e5f070e-cd41-4dd5-93ad-0e73b96d3858/volumes" Dec 01 10:34:37 crc kubenswrapper[4909]: I1201 10:34:37.733336 4909 generic.go:334] "Generic (PLEG): container finished" podID="948b91e0-ecf3-4837-8389-4777dae9d246" 
containerID="e3b2c44c11b8705e591d3aea0b2f6e4531d96d8017fda3c3df517eca8539aaae" exitCode=0 Dec 01 10:34:37 crc kubenswrapper[4909]: I1201 10:34:37.733386 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5q9m7" event={"ID":"948b91e0-ecf3-4837-8389-4777dae9d246","Type":"ContainerDied","Data":"e3b2c44c11b8705e591d3aea0b2f6e4531d96d8017fda3c3df517eca8539aaae"} Dec 01 10:34:37 crc kubenswrapper[4909]: I1201 10:34:37.791712 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5q9m7" Dec 01 10:34:37 crc kubenswrapper[4909]: I1201 10:34:37.888470 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/948b91e0-ecf3-4837-8389-4777dae9d246-utilities\") pod \"948b91e0-ecf3-4837-8389-4777dae9d246\" (UID: \"948b91e0-ecf3-4837-8389-4777dae9d246\") " Dec 01 10:34:37 crc kubenswrapper[4909]: I1201 10:34:37.888569 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhwbj\" (UniqueName: \"kubernetes.io/projected/948b91e0-ecf3-4837-8389-4777dae9d246-kube-api-access-hhwbj\") pod \"948b91e0-ecf3-4837-8389-4777dae9d246\" (UID: \"948b91e0-ecf3-4837-8389-4777dae9d246\") " Dec 01 10:34:37 crc kubenswrapper[4909]: I1201 10:34:37.888615 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/948b91e0-ecf3-4837-8389-4777dae9d246-catalog-content\") pod \"948b91e0-ecf3-4837-8389-4777dae9d246\" (UID: \"948b91e0-ecf3-4837-8389-4777dae9d246\") " Dec 01 10:34:37 crc kubenswrapper[4909]: I1201 10:34:37.889455 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/948b91e0-ecf3-4837-8389-4777dae9d246-utilities" (OuterVolumeSpecName: "utilities") pod "948b91e0-ecf3-4837-8389-4777dae9d246" (UID: 
"948b91e0-ecf3-4837-8389-4777dae9d246"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:37 crc kubenswrapper[4909]: I1201 10:34:37.895155 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/948b91e0-ecf3-4837-8389-4777dae9d246-kube-api-access-hhwbj" (OuterVolumeSpecName: "kube-api-access-hhwbj") pod "948b91e0-ecf3-4837-8389-4777dae9d246" (UID: "948b91e0-ecf3-4837-8389-4777dae9d246"). InnerVolumeSpecName "kube-api-access-hhwbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:37 crc kubenswrapper[4909]: I1201 10:34:37.990571 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/948b91e0-ecf3-4837-8389-4777dae9d246-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:37 crc kubenswrapper[4909]: I1201 10:34:37.990611 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhwbj\" (UniqueName: \"kubernetes.io/projected/948b91e0-ecf3-4837-8389-4777dae9d246-kube-api-access-hhwbj\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:38 crc kubenswrapper[4909]: I1201 10:34:38.009757 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/948b91e0-ecf3-4837-8389-4777dae9d246-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "948b91e0-ecf3-4837-8389-4777dae9d246" (UID: "948b91e0-ecf3-4837-8389-4777dae9d246"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:38 crc kubenswrapper[4909]: I1201 10:34:38.091840 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/948b91e0-ecf3-4837-8389-4777dae9d246-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:38 crc kubenswrapper[4909]: I1201 10:34:38.746336 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5q9m7" event={"ID":"948b91e0-ecf3-4837-8389-4777dae9d246","Type":"ContainerDied","Data":"19e60ec182532e91b8d803915b723bd78022158fbaeff0c4a38897ec2523d0d1"} Dec 01 10:34:38 crc kubenswrapper[4909]: I1201 10:34:38.746389 4909 scope.go:117] "RemoveContainer" containerID="e3b2c44c11b8705e591d3aea0b2f6e4531d96d8017fda3c3df517eca8539aaae" Dec 01 10:34:38 crc kubenswrapper[4909]: I1201 10:34:38.746555 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5q9m7" Dec 01 10:34:38 crc kubenswrapper[4909]: I1201 10:34:38.764390 4909 scope.go:117] "RemoveContainer" containerID="59e237be32ad828e5870edd1316fefe38bd3c2ef0f1a454bfe2db72a3afb8b05" Dec 01 10:34:38 crc kubenswrapper[4909]: I1201 10:34:38.778624 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5q9m7"] Dec 01 10:34:38 crc kubenswrapper[4909]: I1201 10:34:38.785337 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5q9m7"] Dec 01 10:34:38 crc kubenswrapper[4909]: I1201 10:34:38.802917 4909 scope.go:117] "RemoveContainer" containerID="97b0c352a4577d4cf098c65994afa513a05696c212a4ed902c6c536c7f04536c" Dec 01 10:34:39 crc kubenswrapper[4909]: I1201 10:34:39.265050 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="948b91e0-ecf3-4837-8389-4777dae9d246" path="/var/lib/kubelet/pods/948b91e0-ecf3-4837-8389-4777dae9d246/volumes" Dec 01 10:34:54 crc 
kubenswrapper[4909]: I1201 10:34:54.879752 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" podUID="be292a43-f2dc-44e5-8d2f-5b540b79ff6a" containerName="oauth-openshift" containerID="cri-o://ad8b9c8852b3410ee446992e42ce412a9dea8015024e0176175e2a742d443d40" gracePeriod=15 Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.229671 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.281208 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-54b5875b97-v58xp"] Dec 01 10:34:55 crc kubenswrapper[4909]: E1201 10:34:55.281462 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bfb46e1-7822-4508-ac16-d71780bcfe30" containerName="registry-server" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.281477 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bfb46e1-7822-4508-ac16-d71780bcfe30" containerName="registry-server" Dec 01 10:34:55 crc kubenswrapper[4909]: E1201 10:34:55.281494 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ef24d9-67de-4a08-82ff-092b66feb19f" containerName="extract-utilities" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.281503 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ef24d9-67de-4a08-82ff-092b66feb19f" containerName="extract-utilities" Dec 01 10:34:55 crc kubenswrapper[4909]: E1201 10:34:55.281514 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bfb46e1-7822-4508-ac16-d71780bcfe30" containerName="extract-content" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.281522 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bfb46e1-7822-4508-ac16-d71780bcfe30" containerName="extract-content" Dec 01 10:34:55 crc kubenswrapper[4909]: E1201 
10:34:55.281531 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="948b91e0-ecf3-4837-8389-4777dae9d246" containerName="extract-content" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.281539 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="948b91e0-ecf3-4837-8389-4777dae9d246" containerName="extract-content" Dec 01 10:34:55 crc kubenswrapper[4909]: E1201 10:34:55.281554 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ef24d9-67de-4a08-82ff-092b66feb19f" containerName="extract-content" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.281562 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ef24d9-67de-4a08-82ff-092b66feb19f" containerName="extract-content" Dec 01 10:34:55 crc kubenswrapper[4909]: E1201 10:34:55.281575 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="948b91e0-ecf3-4837-8389-4777dae9d246" containerName="registry-server" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.281582 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="948b91e0-ecf3-4837-8389-4777dae9d246" containerName="registry-server" Dec 01 10:34:55 crc kubenswrapper[4909]: E1201 10:34:55.281595 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e5f070e-cd41-4dd5-93ad-0e73b96d3858" containerName="extract-content" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.281603 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e5f070e-cd41-4dd5-93ad-0e73b96d3858" containerName="extract-content" Dec 01 10:34:55 crc kubenswrapper[4909]: E1201 10:34:55.281612 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bfb46e1-7822-4508-ac16-d71780bcfe30" containerName="extract-utilities" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.281619 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bfb46e1-7822-4508-ac16-d71780bcfe30" containerName="extract-utilities" Dec 01 10:34:55 crc kubenswrapper[4909]: E1201 
10:34:55.281631 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be292a43-f2dc-44e5-8d2f-5b540b79ff6a" containerName="oauth-openshift" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.281638 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="be292a43-f2dc-44e5-8d2f-5b540b79ff6a" containerName="oauth-openshift" Dec 01 10:34:55 crc kubenswrapper[4909]: E1201 10:34:55.281649 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="948b91e0-ecf3-4837-8389-4777dae9d246" containerName="extract-utilities" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.281697 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="948b91e0-ecf3-4837-8389-4777dae9d246" containerName="extract-utilities" Dec 01 10:34:55 crc kubenswrapper[4909]: E1201 10:34:55.281712 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e5f070e-cd41-4dd5-93ad-0e73b96d3858" containerName="extract-utilities" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.281719 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e5f070e-cd41-4dd5-93ad-0e73b96d3858" containerName="extract-utilities" Dec 01 10:34:55 crc kubenswrapper[4909]: E1201 10:34:55.281732 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e5f070e-cd41-4dd5-93ad-0e73b96d3858" containerName="registry-server" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.281739 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e5f070e-cd41-4dd5-93ad-0e73b96d3858" containerName="registry-server" Dec 01 10:34:55 crc kubenswrapper[4909]: E1201 10:34:55.281749 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ef24d9-67de-4a08-82ff-092b66feb19f" containerName="registry-server" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.281757 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ef24d9-67de-4a08-82ff-092b66feb19f" containerName="registry-server" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 
10:34:55.281937 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ef24d9-67de-4a08-82ff-092b66feb19f" containerName="registry-server" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.281961 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="948b91e0-ecf3-4837-8389-4777dae9d246" containerName="registry-server" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.281973 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e5f070e-cd41-4dd5-93ad-0e73b96d3858" containerName="registry-server" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.281984 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bfb46e1-7822-4508-ac16-d71780bcfe30" containerName="registry-server" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.281997 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="be292a43-f2dc-44e5-8d2f-5b540b79ff6a" containerName="oauth-openshift" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.282456 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-54b5875b97-v58xp"] Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.282548 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.424018 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-cliconfig\") pod \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.424108 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-template-error\") pod \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.424193 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-router-certs\") pod \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.424217 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxrh4\" (UniqueName: \"kubernetes.io/projected/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-kube-api-access-fxrh4\") pod \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.424246 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-idp-0-file-data\") pod \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " Dec 
01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.424322 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-serving-cert\") pod \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.424353 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-trusted-ca-bundle\") pod \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.424382 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-ocp-branding-template\") pod \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.424403 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-audit-dir\") pod \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.424423 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-template-login\") pod \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.424454 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-template-provider-selection\") pod \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.424708 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "be292a43-f2dc-44e5-8d2f-5b540b79ff6a" (UID: "be292a43-f2dc-44e5-8d2f-5b540b79ff6a"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.424481 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-audit-policies\") pod \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.425593 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-service-ca\") pod \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.425665 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-session\") pod \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\" (UID: \"be292a43-f2dc-44e5-8d2f-5b540b79ff6a\") " Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.425765 4909 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "be292a43-f2dc-44e5-8d2f-5b540b79ff6a" (UID: "be292a43-f2dc-44e5-8d2f-5b540b79ff6a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.425952 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "be292a43-f2dc-44e5-8d2f-5b540b79ff6a" (UID: "be292a43-f2dc-44e5-8d2f-5b540b79ff6a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.426117 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.426209 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-system-session\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.426243 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "be292a43-f2dc-44e5-8d2f-5b540b79ff6a" (UID: "be292a43-f2dc-44e5-8d2f-5b540b79ff6a"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.426249 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-system-service-ca\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.426359 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.426436 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-audit-dir\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.426470 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-audit-policies\") pod 
\"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.426501 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-user-template-login\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.426536 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-system-router-certs\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.426562 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.426602 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " 
pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.426626 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-user-template-error\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.426680 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m7mc\" (UniqueName: \"kubernetes.io/projected/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-kube-api-access-9m7mc\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.426719 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.426742 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.426816 4909 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.426836 4909 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.426851 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.427368 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.427656 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "be292a43-f2dc-44e5-8d2f-5b540b79ff6a" (UID: "be292a43-f2dc-44e5-8d2f-5b540b79ff6a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.430601 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "be292a43-f2dc-44e5-8d2f-5b540b79ff6a" (UID: "be292a43-f2dc-44e5-8d2f-5b540b79ff6a"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.431235 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "be292a43-f2dc-44e5-8d2f-5b540b79ff6a" (UID: "be292a43-f2dc-44e5-8d2f-5b540b79ff6a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.431725 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "be292a43-f2dc-44e5-8d2f-5b540b79ff6a" (UID: "be292a43-f2dc-44e5-8d2f-5b540b79ff6a"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.431935 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "be292a43-f2dc-44e5-8d2f-5b540b79ff6a" (UID: "be292a43-f2dc-44e5-8d2f-5b540b79ff6a"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.432219 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "be292a43-f2dc-44e5-8d2f-5b540b79ff6a" (UID: "be292a43-f2dc-44e5-8d2f-5b540b79ff6a"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.435942 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-kube-api-access-fxrh4" (OuterVolumeSpecName: "kube-api-access-fxrh4") pod "be292a43-f2dc-44e5-8d2f-5b540b79ff6a" (UID: "be292a43-f2dc-44e5-8d2f-5b540b79ff6a"). InnerVolumeSpecName "kube-api-access-fxrh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.438084 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "be292a43-f2dc-44e5-8d2f-5b540b79ff6a" (UID: "be292a43-f2dc-44e5-8d2f-5b540b79ff6a"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.439391 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "be292a43-f2dc-44e5-8d2f-5b540b79ff6a" (UID: "be292a43-f2dc-44e5-8d2f-5b540b79ff6a"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.439437 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "be292a43-f2dc-44e5-8d2f-5b540b79ff6a" (UID: "be292a43-f2dc-44e5-8d2f-5b540b79ff6a"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.528166 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-audit-dir\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.528488 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-audit-policies\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.528338 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-audit-dir\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.528657 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-user-template-login\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.529016 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.529109 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.529171 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-user-template-error\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.529199 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.529279 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m7mc\" (UniqueName: \"kubernetes.io/projected/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-kube-api-access-9m7mc\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.529335 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.529488 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.529566 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-audit-policies\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.529607 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.529649 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-system-session\") pod 
\"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.529692 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-system-service-ca\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.529775 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.529905 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.529933 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.529949 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 
10:34:55.529964 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.529982 4909 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.530002 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.530017 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.530031 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.530044 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxrh4\" (UniqueName: \"kubernetes.io/projected/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-kube-api-access-fxrh4\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.530059 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/be292a43-f2dc-44e5-8d2f-5b540b79ff6a-v4-0-config-user-idp-0-file-data\") on node \"crc\" 
DevicePath \"\"" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.530103 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.530595 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.530686 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-system-service-ca\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.532537 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-system-router-certs\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.533207 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.533586 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-system-session\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.533963 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-user-template-error\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.534091 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-user-template-login\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.534255 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 
01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.534297 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.534613 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.545052 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m7mc\" (UniqueName: \"kubernetes.io/projected/dc7ec8be-a9e5-4789-bfef-b71faf9f45b9-kube-api-access-9m7mc\") pod \"oauth-openshift-54b5875b97-v58xp\" (UID: \"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9\") " pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.596750 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.859669 4909 generic.go:334] "Generic (PLEG): container finished" podID="be292a43-f2dc-44e5-8d2f-5b540b79ff6a" containerID="ad8b9c8852b3410ee446992e42ce412a9dea8015024e0176175e2a742d443d40" exitCode=0 Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.859710 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" event={"ID":"be292a43-f2dc-44e5-8d2f-5b540b79ff6a","Type":"ContainerDied","Data":"ad8b9c8852b3410ee446992e42ce412a9dea8015024e0176175e2a742d443d40"} Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.859745 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" event={"ID":"be292a43-f2dc-44e5-8d2f-5b540b79ff6a","Type":"ContainerDied","Data":"428fbe4e4f0563cf1f36948b6b4a0eb92c866c0f294c2349fd7c8e65dd00c525"} Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.859765 4909 scope.go:117] "RemoveContainer" containerID="ad8b9c8852b3410ee446992e42ce412a9dea8015024e0176175e2a742d443d40" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.859784 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hpz48" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.877730 4909 scope.go:117] "RemoveContainer" containerID="ad8b9c8852b3410ee446992e42ce412a9dea8015024e0176175e2a742d443d40" Dec 01 10:34:55 crc kubenswrapper[4909]: E1201 10:34:55.878074 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad8b9c8852b3410ee446992e42ce412a9dea8015024e0176175e2a742d443d40\": container with ID starting with ad8b9c8852b3410ee446992e42ce412a9dea8015024e0176175e2a742d443d40 not found: ID does not exist" containerID="ad8b9c8852b3410ee446992e42ce412a9dea8015024e0176175e2a742d443d40" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.878110 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad8b9c8852b3410ee446992e42ce412a9dea8015024e0176175e2a742d443d40"} err="failed to get container status \"ad8b9c8852b3410ee446992e42ce412a9dea8015024e0176175e2a742d443d40\": rpc error: code = NotFound desc = could not find container \"ad8b9c8852b3410ee446992e42ce412a9dea8015024e0176175e2a742d443d40\": container with ID starting with ad8b9c8852b3410ee446992e42ce412a9dea8015024e0176175e2a742d443d40 not found: ID does not exist" Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.890192 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hpz48"] Dec 01 10:34:55 crc kubenswrapper[4909]: I1201 10:34:55.893343 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hpz48"] Dec 01 10:34:56 crc kubenswrapper[4909]: I1201 10:34:56.000565 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-54b5875b97-v58xp"] Dec 01 10:34:56 crc kubenswrapper[4909]: I1201 10:34:56.868058 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" event={"ID":"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9","Type":"ContainerStarted","Data":"3ea5924fcbe4c11fc94d12819ec64c8d7bc1fae744b4577b3b1c2ae831eb5e17"} Dec 01 10:34:56 crc kubenswrapper[4909]: I1201 10:34:56.868413 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" event={"ID":"dc7ec8be-a9e5-4789-bfef-b71faf9f45b9","Type":"ContainerStarted","Data":"371baafc078ce83ed748591a5a24f3cc4f0a5766fb98ec7bce41e68198b690f8"} Dec 01 10:34:56 crc kubenswrapper[4909]: I1201 10:34:56.868626 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:56 crc kubenswrapper[4909]: I1201 10:34:56.873497 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" Dec 01 10:34:56 crc kubenswrapper[4909]: I1201 10:34:56.895145 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-54b5875b97-v58xp" podStartSLOduration=27.895120258 podStartE2EDuration="27.895120258s" podCreationTimestamp="2025-12-01 10:34:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:34:56.888166749 +0000 UTC m=+214.122637647" watchObservedRunningTime="2025-12-01 10:34:56.895120258 +0000 UTC m=+214.129591186" Dec 01 10:34:57 crc kubenswrapper[4909]: I1201 10:34:57.265139 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be292a43-f2dc-44e5-8d2f-5b540b79ff6a" path="/var/lib/kubelet/pods/be292a43-f2dc-44e5-8d2f-5b540b79ff6a/volumes" Dec 01 10:35:06 crc kubenswrapper[4909]: I1201 10:35:06.193789 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:35:06 crc kubenswrapper[4909]: I1201 10:35:06.194424 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:35:06 crc kubenswrapper[4909]: I1201 10:35:06.194476 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:35:06 crc kubenswrapper[4909]: I1201 10:35:06.194977 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d"} pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:35:06 crc kubenswrapper[4909]: I1201 10:35:06.195023 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" containerID="cri-o://068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d" gracePeriod=600 Dec 01 10:35:06 crc kubenswrapper[4909]: I1201 10:35:06.943169 4909 generic.go:334] "Generic (PLEG): container finished" podID="672850e4-d044-44cc-b8a2-517dc1a285be" containerID="068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d" exitCode=0 Dec 01 10:35:06 crc kubenswrapper[4909]: I1201 10:35:06.943261 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerDied","Data":"068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d"} Dec 01 10:35:06 crc kubenswrapper[4909]: I1201 10:35:06.944018 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"dbe2682cc5e6b212bd8e646f3120474003c4d5f01aacb8310079b39b95f1eaef"} Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.233045 4909 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.234560 4909 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.234758 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.234980 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471" gracePeriod=15 Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.235047 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45" gracePeriod=15 Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.235102 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a" gracePeriod=15 Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.235155 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d" gracePeriod=15 Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.235129 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd" gracePeriod=15 Dec 01 10:35:13 crc 
kubenswrapper[4909]: I1201 10:35:13.235201 4909 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 10:35:13 crc kubenswrapper[4909]: E1201 10:35:13.235590 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.235616 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 10:35:13 crc kubenswrapper[4909]: E1201 10:35:13.235631 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.235640 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 10:35:13 crc kubenswrapper[4909]: E1201 10:35:13.235652 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.235658 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 10:35:13 crc kubenswrapper[4909]: E1201 10:35:13.235671 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.235678 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 10:35:13 crc kubenswrapper[4909]: E1201 10:35:13.235689 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 01 10:35:13 crc 
kubenswrapper[4909]: I1201 10:35:13.235695 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 01 10:35:13 crc kubenswrapper[4909]: E1201 10:35:13.235704 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.235712 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 10:35:13 crc kubenswrapper[4909]: E1201 10:35:13.235726 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.235732 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.235956 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.235971 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.235982 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.235997 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.236009 4909 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.236019 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.269742 4909 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.309121 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.309175 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.309202 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.309243 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.309272 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.309304 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.309326 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.309349 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.410647 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.410705 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.410796 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.410735 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.410828 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.410853 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.410908 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.410909 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.410949 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.411029 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.411178 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.411209 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.411247 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.411270 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.411316 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.411419 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.984428 4909 generic.go:334] "Generic (PLEG): container finished" podID="e0b055ce-6813-4bda-80c6-e71788d05982" containerID="73bb607e04773da09307439428d10e438844a5832253823382b19894c3b47bb2" exitCode=0 Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.984519 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e0b055ce-6813-4bda-80c6-e71788d05982","Type":"ContainerDied","Data":"73bb607e04773da09307439428d10e438844a5832253823382b19894c3b47bb2"} Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.985407 4909 status_manager.go:851] "Failed to get status for pod" podUID="e0b055ce-6813-4bda-80c6-e71788d05982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.72:6443: connect: connection refused" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.987996 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.990385 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.992066 4909 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45" exitCode=0 Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.992099 4909 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d" exitCode=0 Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.992109 4909 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a" exitCode=0 Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.992121 4909 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd" exitCode=2 Dec 01 10:35:13 crc kubenswrapper[4909]: I1201 10:35:13.992170 4909 scope.go:117] "RemoveContainer" containerID="fa4cdfe182d78f6089f128ebea45405ccbcd18376e970649c30d59f5a5321cf1" Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.002475 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.254890 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.255560 4909 status_manager.go:851] "Failed to get status for pod" podUID="e0b055ce-6813-4bda-80c6-e71788d05982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.72:6443: connect: connection refused" Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.340684 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e0b055ce-6813-4bda-80c6-e71788d05982-var-lock\") pod \"e0b055ce-6813-4bda-80c6-e71788d05982\" (UID: \"e0b055ce-6813-4bda-80c6-e71788d05982\") " Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.341176 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0b055ce-6813-4bda-80c6-e71788d05982-kube-api-access\") pod \"e0b055ce-6813-4bda-80c6-e71788d05982\" (UID: \"e0b055ce-6813-4bda-80c6-e71788d05982\") " Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.341331 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0b055ce-6813-4bda-80c6-e71788d05982-kubelet-dir\") pod \"e0b055ce-6813-4bda-80c6-e71788d05982\" (UID: \"e0b055ce-6813-4bda-80c6-e71788d05982\") " Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.342682 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0b055ce-6813-4bda-80c6-e71788d05982-var-lock" (OuterVolumeSpecName: "var-lock") pod "e0b055ce-6813-4bda-80c6-e71788d05982" (UID: "e0b055ce-6813-4bda-80c6-e71788d05982"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.344531 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0b055ce-6813-4bda-80c6-e71788d05982-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e0b055ce-6813-4bda-80c6-e71788d05982" (UID: "e0b055ce-6813-4bda-80c6-e71788d05982"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.345658 4909 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e0b055ce-6813-4bda-80c6-e71788d05982-var-lock\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.345685 4909 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0b055ce-6813-4bda-80c6-e71788d05982-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.366333 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0b055ce-6813-4bda-80c6-e71788d05982-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e0b055ce-6813-4bda-80c6-e71788d05982" (UID: "e0b055ce-6813-4bda-80c6-e71788d05982"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.447382 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0b055ce-6813-4bda-80c6-e71788d05982-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.623025 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.624556 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.625282 4909 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.72:6443: connect: connection refused" Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.625901 4909 status_manager.go:851] "Failed to get status for pod" podUID="e0b055ce-6813-4bda-80c6-e71788d05982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.72:6443: connect: connection refused" Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.649471 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.649574 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.649631 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.649700 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.649710 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.649742 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.650249 4909 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.650272 4909 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:15 crc kubenswrapper[4909]: I1201 10:35:15.650281 4909 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.012282 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e0b055ce-6813-4bda-80c6-e71788d05982","Type":"ContainerDied","Data":"7630fd4ccc0fe9e78bcfe5048d6ea7f949682a22968798ce985db65a5a9d760d"} Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.013253 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7630fd4ccc0fe9e78bcfe5048d6ea7f949682a22968798ce985db65a5a9d760d" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.012351 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.015206 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.015830 4909 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471" exitCode=0 Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.015918 4909 scope.go:117] "RemoveContainer" containerID="69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.015963 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.030411 4909 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.72:6443: connect: connection refused" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.030807 4909 status_manager.go:851] "Failed to get status for pod" podUID="e0b055ce-6813-4bda-80c6-e71788d05982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.72:6443: connect: connection refused" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.031218 4909 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.72:6443: connect: connection refused" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.031510 4909 status_manager.go:851] "Failed to get status for pod" podUID="e0b055ce-6813-4bda-80c6-e71788d05982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.72:6443: connect: connection refused" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.031539 4909 scope.go:117] "RemoveContainer" containerID="e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.043815 4909 scope.go:117] "RemoveContainer" containerID="c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.055902 4909 scope.go:117] "RemoveContainer" containerID="5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.074219 4909 scope.go:117] "RemoveContainer" containerID="da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.087405 4909 scope.go:117] "RemoveContainer" containerID="f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.102128 4909 scope.go:117] "RemoveContainer" containerID="69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45" Dec 01 10:35:16 crc kubenswrapper[4909]: E1201 10:35:16.102598 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\": container with ID starting with 69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45 not found: ID does not 
exist" containerID="69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.102634 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45"} err="failed to get container status \"69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\": rpc error: code = NotFound desc = could not find container \"69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45\": container with ID starting with 69a427aef5140691990a12fc3b05310a02143fa0fd92730da771d553a955af45 not found: ID does not exist" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.102662 4909 scope.go:117] "RemoveContainer" containerID="e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d" Dec 01 10:35:16 crc kubenswrapper[4909]: E1201 10:35:16.103071 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\": container with ID starting with e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d not found: ID does not exist" containerID="e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.103094 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d"} err="failed to get container status \"e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\": rpc error: code = NotFound desc = could not find container \"e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d\": container with ID starting with e8b2f0fdf0900c7d4587b6cd91285e5d8e2cf247930d77831497b6eabf01ba9d not found: ID does not exist" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.103110 4909 scope.go:117] 
"RemoveContainer" containerID="c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a" Dec 01 10:35:16 crc kubenswrapper[4909]: E1201 10:35:16.103339 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\": container with ID starting with c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a not found: ID does not exist" containerID="c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.103421 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a"} err="failed to get container status \"c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\": rpc error: code = NotFound desc = could not find container \"c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a\": container with ID starting with c5695352d1a16ce2d98be297e51e9197b20d24dbbb73bbe175a30ce0d957191a not found: ID does not exist" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.103495 4909 scope.go:117] "RemoveContainer" containerID="5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd" Dec 01 10:35:16 crc kubenswrapper[4909]: E1201 10:35:16.103812 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\": container with ID starting with 5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd not found: ID does not exist" containerID="5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.103905 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd"} err="failed to get container status \"5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\": rpc error: code = NotFound desc = could not find container \"5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd\": container with ID starting with 5fc9a79953a95e9547662567ed0513ec511ecad03349c9ad90c85bc057f8d7cd not found: ID does not exist" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.103983 4909 scope.go:117] "RemoveContainer" containerID="da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471" Dec 01 10:35:16 crc kubenswrapper[4909]: E1201 10:35:16.104333 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\": container with ID starting with da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471 not found: ID does not exist" containerID="da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.104408 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471"} err="failed to get container status \"da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\": rpc error: code = NotFound desc = could not find container \"da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471\": container with ID starting with da96f9227887e71787e11717ab4cfa02efa4d5e550aba6cf9388dc7137f4c471 not found: ID does not exist" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.104464 4909 scope.go:117] "RemoveContainer" containerID="f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a" Dec 01 10:35:16 crc kubenswrapper[4909]: E1201 10:35:16.104814 4909 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\": container with ID starting with f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a not found: ID does not exist" containerID="f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.104848 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a"} err="failed to get container status \"f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\": rpc error: code = NotFound desc = could not find container \"f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a\": container with ID starting with f305d3b1640fdd69c78c82f519bd484657d74b830026c804cb74ede174322c2a not found: ID does not exist" Dec 01 10:35:16 crc kubenswrapper[4909]: E1201 10:35:16.918147 4909 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.72:6443: connect: connection refused" Dec 01 10:35:16 crc kubenswrapper[4909]: E1201 10:35:16.918778 4909 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.72:6443: connect: connection refused" Dec 01 10:35:16 crc kubenswrapper[4909]: E1201 10:35:16.919106 4909 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.72:6443: connect: connection refused" Dec 01 10:35:16 crc kubenswrapper[4909]: E1201 10:35:16.919428 4909 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.72:6443: connect: connection refused" Dec 01 10:35:16 crc kubenswrapper[4909]: E1201 10:35:16.919716 4909 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.72:6443: connect: connection refused" Dec 01 10:35:16 crc kubenswrapper[4909]: I1201 10:35:16.919840 4909 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 01 10:35:16 crc kubenswrapper[4909]: E1201 10:35:16.920146 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.72:6443: connect: connection refused" interval="200ms" Dec 01 10:35:17 crc kubenswrapper[4909]: E1201 10:35:17.121402 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.72:6443: connect: connection refused" interval="400ms" Dec 01 10:35:17 crc kubenswrapper[4909]: I1201 10:35:17.263979 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 01 10:35:17 crc kubenswrapper[4909]: E1201 10:35:17.523151 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.72:6443: connect: connection refused" interval="800ms" Dec 01 10:35:18 crc kubenswrapper[4909]: E1201 10:35:18.271053 4909 kubelet.go:1929] "Failed creating a 
mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.72:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:35:18 crc kubenswrapper[4909]: I1201 10:35:18.272196 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:35:18 crc kubenswrapper[4909]: E1201 10:35:18.293870 4909 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.72:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d10ff70c01ce5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 10:35:18.293445861 +0000 UTC m=+235.527916759,LastTimestamp:2025-12-01 10:35:18.293445861 +0000 UTC m=+235.527916759,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 10:35:18 crc kubenswrapper[4909]: E1201 10:35:18.325622 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.72:6443: connect: connection refused" interval="1.6s" Dec 01 10:35:19 crc 
kubenswrapper[4909]: I1201 10:35:19.037890 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"e2b9105110540bc39f6f59be1ce595dee0cfa49a7bfb404739d0ca9b599379c7"} Dec 01 10:35:19 crc kubenswrapper[4909]: I1201 10:35:19.038627 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"19b1f67c5754dccbc1051d8776047d601dbe107be422b956da14c13cf24f5272"} Dec 01 10:35:19 crc kubenswrapper[4909]: I1201 10:35:19.039332 4909 status_manager.go:851] "Failed to get status for pod" podUID="e0b055ce-6813-4bda-80c6-e71788d05982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.72:6443: connect: connection refused" Dec 01 10:35:19 crc kubenswrapper[4909]: E1201 10:35:19.039399 4909 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.72:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:35:19 crc kubenswrapper[4909]: E1201 10:35:19.926917 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.72:6443: connect: connection refused" interval="3.2s" Dec 01 10:35:22 crc kubenswrapper[4909]: E1201 10:35:22.582815 4909 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.72:6443: connect: connection refused" 
event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d10ff70c01ce5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 10:35:18.293445861 +0000 UTC m=+235.527916759,LastTimestamp:2025-12-01 10:35:18.293445861 +0000 UTC m=+235.527916759,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 10:35:23 crc kubenswrapper[4909]: E1201 10:35:23.128450 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.72:6443: connect: connection refused" interval="6.4s" Dec 01 10:35:23 crc kubenswrapper[4909]: I1201 10:35:23.259495 4909 status_manager.go:851] "Failed to get status for pod" podUID="e0b055ce-6813-4bda-80c6-e71788d05982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.72:6443: connect: connection refused" Dec 01 10:35:25 crc kubenswrapper[4909]: I1201 10:35:25.256350 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:25 crc kubenswrapper[4909]: I1201 10:35:25.257634 4909 status_manager.go:851] "Failed to get status for pod" podUID="e0b055ce-6813-4bda-80c6-e71788d05982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.72:6443: connect: connection refused" Dec 01 10:35:25 crc kubenswrapper[4909]: I1201 10:35:25.274622 4909 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="43b14afb-15c2-4260-9e25-008f9466724b" Dec 01 10:35:25 crc kubenswrapper[4909]: I1201 10:35:25.274661 4909 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="43b14afb-15c2-4260-9e25-008f9466724b" Dec 01 10:35:25 crc kubenswrapper[4909]: E1201 10:35:25.275144 4909 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.72:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:25 crc kubenswrapper[4909]: I1201 10:35:25.275682 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:25 crc kubenswrapper[4909]: W1201 10:35:25.309842 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-12b126e24f8a3e615916e4f33ee7ce970ffcfd3c6f670a1f376b39a6f5c71165 WatchSource:0}: Error finding container 12b126e24f8a3e615916e4f33ee7ce970ffcfd3c6f670a1f376b39a6f5c71165: Status 404 returned error can't find the container with id 12b126e24f8a3e615916e4f33ee7ce970ffcfd3c6f670a1f376b39a6f5c71165 Dec 01 10:35:25 crc kubenswrapper[4909]: E1201 10:35:25.640004 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-3ef28631cb6efd345c720464443d07f23e4122cca0e7b4cae100d55fd9769beb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-conmon-3ef28631cb6efd345c720464443d07f23e4122cca0e7b4cae100d55fd9769beb.scope\": RecentStats: unable to find data in memory cache]" Dec 01 10:35:26 crc kubenswrapper[4909]: I1201 10:35:26.087528 4909 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="3ef28631cb6efd345c720464443d07f23e4122cca0e7b4cae100d55fd9769beb" exitCode=0 Dec 01 10:35:26 crc kubenswrapper[4909]: I1201 10:35:26.087623 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"3ef28631cb6efd345c720464443d07f23e4122cca0e7b4cae100d55fd9769beb"} Dec 01 10:35:26 crc kubenswrapper[4909]: I1201 10:35:26.087966 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"12b126e24f8a3e615916e4f33ee7ce970ffcfd3c6f670a1f376b39a6f5c71165"} Dec 01 10:35:26 crc kubenswrapper[4909]: I1201 10:35:26.088446 4909 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="43b14afb-15c2-4260-9e25-008f9466724b" Dec 01 10:35:26 crc kubenswrapper[4909]: I1201 10:35:26.088564 4909 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="43b14afb-15c2-4260-9e25-008f9466724b" Dec 01 10:35:26 crc kubenswrapper[4909]: I1201 10:35:26.088808 4909 status_manager.go:851] "Failed to get status for pod" podUID="e0b055ce-6813-4bda-80c6-e71788d05982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.72:6443: connect: connection refused" Dec 01 10:35:26 crc kubenswrapper[4909]: E1201 10:35:26.089336 4909 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.72:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:26 crc kubenswrapper[4909]: I1201 10:35:26.091017 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 10:35:26 crc kubenswrapper[4909]: I1201 10:35:26.091110 4909 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a" exitCode=1 Dec 01 10:35:26 crc kubenswrapper[4909]: I1201 10:35:26.091142 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a"} Dec 01 10:35:26 crc kubenswrapper[4909]: I1201 10:35:26.091662 4909 scope.go:117] "RemoveContainer" containerID="95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a" Dec 01 10:35:26 crc kubenswrapper[4909]: I1201 10:35:26.092346 4909 status_manager.go:851] "Failed to get status for pod" podUID="e0b055ce-6813-4bda-80c6-e71788d05982" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.72:6443: connect: connection refused" Dec 01 10:35:26 crc kubenswrapper[4909]: I1201 10:35:26.092952 4909 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.72:6443: connect: connection refused" Dec 01 10:35:27 crc kubenswrapper[4909]: I1201 10:35:27.100658 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a92d462bd9da8d3596ad2199c738142d0554084f7c4c8af469485ded61414a11"} Dec 01 10:35:27 crc kubenswrapper[4909]: I1201 10:35:27.100987 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a9a0f61723858c20288c23ed162c7f43529eda18ca937b32e6454b6c42552655"} Dec 01 10:35:27 crc kubenswrapper[4909]: I1201 10:35:27.101010 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"03edee07f506b5802411a08b780d7bf89fe4a8b9d79fea7dbdb9b1585267009c"} Dec 01 10:35:27 crc kubenswrapper[4909]: I1201 10:35:27.101019 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2206ed9bdf02f9ce4bccb4f40f4112a40fc09d934448e02449b39640fd99d878"} Dec 01 10:35:27 crc kubenswrapper[4909]: I1201 10:35:27.105772 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 10:35:27 crc kubenswrapper[4909]: I1201 10:35:27.105817 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"79464f59dfa256499beea30fd07c026b5766ee7368c37ceedda194a19619c902"} Dec 01 10:35:28 crc kubenswrapper[4909]: I1201 10:35:28.115657 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"02f756b86ac9293d269ee47ae5b33daebc9b5988de8de4e54bb9a0f3c4703c81"} Dec 01 10:35:28 crc kubenswrapper[4909]: I1201 10:35:28.116380 4909 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="43b14afb-15c2-4260-9e25-008f9466724b" Dec 01 10:35:28 crc kubenswrapper[4909]: I1201 10:35:28.116441 4909 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="43b14afb-15c2-4260-9e25-008f9466724b" Dec 01 10:35:29 crc kubenswrapper[4909]: I1201 10:35:29.401475 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:35:30 
crc kubenswrapper[4909]: I1201 10:35:30.276767 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:30 crc kubenswrapper[4909]: I1201 10:35:30.277019 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:30 crc kubenswrapper[4909]: I1201 10:35:30.282121 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:33 crc kubenswrapper[4909]: I1201 10:35:33.129113 4909 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:33 crc kubenswrapper[4909]: I1201 10:35:33.266835 4909 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4d2194e1-08e8-4d4e-bd4a-b43fab06757c" Dec 01 10:35:34 crc kubenswrapper[4909]: I1201 10:35:34.163413 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:34 crc kubenswrapper[4909]: I1201 10:35:34.163510 4909 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="43b14afb-15c2-4260-9e25-008f9466724b" Dec 01 10:35:34 crc kubenswrapper[4909]: I1201 10:35:34.163531 4909 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="43b14afb-15c2-4260-9e25-008f9466724b" Dec 01 10:35:34 crc kubenswrapper[4909]: I1201 10:35:34.167200 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:34 crc kubenswrapper[4909]: I1201 10:35:34.167528 4909 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" 
oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4d2194e1-08e8-4d4e-bd4a-b43fab06757c" Dec 01 10:35:35 crc kubenswrapper[4909]: I1201 10:35:35.166939 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:35:35 crc kubenswrapper[4909]: I1201 10:35:35.167291 4909 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 01 10:35:35 crc kubenswrapper[4909]: I1201 10:35:35.169821 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 01 10:35:35 crc kubenswrapper[4909]: I1201 10:35:35.174500 4909 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="43b14afb-15c2-4260-9e25-008f9466724b" Dec 01 10:35:35 crc kubenswrapper[4909]: I1201 10:35:35.174532 4909 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="43b14afb-15c2-4260-9e25-008f9466724b" Dec 01 10:35:35 crc kubenswrapper[4909]: I1201 10:35:35.177718 4909 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4d2194e1-08e8-4d4e-bd4a-b43fab06757c" Dec 01 10:35:43 crc kubenswrapper[4909]: I1201 10:35:43.044268 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 01 10:35:43 
crc kubenswrapper[4909]: I1201 10:35:43.707163 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 10:35:43 crc kubenswrapper[4909]: I1201 10:35:43.986325 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 01 10:35:44 crc kubenswrapper[4909]: I1201 10:35:44.327318 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 10:35:44 crc kubenswrapper[4909]: I1201 10:35:44.707113 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 01 10:35:44 crc kubenswrapper[4909]: I1201 10:35:44.817452 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 01 10:35:44 crc kubenswrapper[4909]: I1201 10:35:44.922002 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 01 10:35:44 crc kubenswrapper[4909]: I1201 10:35:44.953582 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 10:35:45 crc kubenswrapper[4909]: I1201 10:35:45.038666 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 01 10:35:45 crc kubenswrapper[4909]: I1201 10:35:45.167292 4909 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 01 10:35:45 crc kubenswrapper[4909]: I1201 10:35:45.167391 4909 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 01 10:35:45 crc kubenswrapper[4909]: I1201 10:35:45.274978 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 01 10:35:45 crc kubenswrapper[4909]: I1201 10:35:45.286777 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 01 10:35:45 crc kubenswrapper[4909]: I1201 10:35:45.346266 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 01 10:35:45 crc kubenswrapper[4909]: I1201 10:35:45.527988 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 01 10:35:45 crc kubenswrapper[4909]: I1201 10:35:45.625511 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 01 10:35:45 crc kubenswrapper[4909]: I1201 10:35:45.908371 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 01 10:35:45 crc kubenswrapper[4909]: I1201 10:35:45.998658 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 01 10:35:46 crc kubenswrapper[4909]: I1201 10:35:46.039957 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 01 10:35:46 crc kubenswrapper[4909]: I1201 10:35:46.129250 4909 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 01 10:35:46 crc kubenswrapper[4909]: I1201 10:35:46.228582 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 01 10:35:46 crc kubenswrapper[4909]: I1201 10:35:46.362793 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 01 10:35:46 crc kubenswrapper[4909]: I1201 10:35:46.394487 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 01 10:35:46 crc kubenswrapper[4909]: I1201 10:35:46.417259 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 01 10:35:46 crc kubenswrapper[4909]: I1201 10:35:46.450723 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 01 10:35:46 crc kubenswrapper[4909]: I1201 10:35:46.660249 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 01 10:35:46 crc kubenswrapper[4909]: I1201 10:35:46.713426 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 01 10:35:46 crc kubenswrapper[4909]: I1201 10:35:46.734922 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 01 10:35:46 crc kubenswrapper[4909]: I1201 10:35:46.813656 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 01 10:35:46 crc kubenswrapper[4909]: I1201 10:35:46.863506 4909 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"service-ca-bundle" Dec 01 10:35:46 crc kubenswrapper[4909]: I1201 10:35:46.868604 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 01 10:35:47 crc kubenswrapper[4909]: I1201 10:35:47.050647 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 01 10:35:47 crc kubenswrapper[4909]: I1201 10:35:47.103493 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 01 10:35:47 crc kubenswrapper[4909]: I1201 10:35:47.125041 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 01 10:35:47 crc kubenswrapper[4909]: I1201 10:35:47.135136 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 01 10:35:47 crc kubenswrapper[4909]: I1201 10:35:47.215802 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 01 10:35:47 crc kubenswrapper[4909]: I1201 10:35:47.256272 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 01 10:35:47 crc kubenswrapper[4909]: I1201 10:35:47.319686 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 01 10:35:47 crc kubenswrapper[4909]: I1201 10:35:47.406459 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 01 10:35:47 crc kubenswrapper[4909]: I1201 10:35:47.545157 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 01 10:35:47 crc kubenswrapper[4909]: I1201 10:35:47.622464 4909 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 01 10:35:47 crc kubenswrapper[4909]: I1201 10:35:47.785868 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 01 10:35:47 crc kubenswrapper[4909]: I1201 10:35:47.861949 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 01 10:35:47 crc kubenswrapper[4909]: I1201 10:35:47.897151 4909 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 01 10:35:47 crc kubenswrapper[4909]: I1201 10:35:47.910396 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 01 10:35:47 crc kubenswrapper[4909]: I1201 10:35:47.915310 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 01 10:35:47 crc kubenswrapper[4909]: I1201 10:35:47.965566 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 01 10:35:48 crc kubenswrapper[4909]: I1201 10:35:48.088551 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 01 10:35:48 crc kubenswrapper[4909]: I1201 10:35:48.091907 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 01 10:35:48 crc kubenswrapper[4909]: I1201 10:35:48.099571 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 01 10:35:48 crc kubenswrapper[4909]: I1201 10:35:48.202898 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 01 10:35:48 crc kubenswrapper[4909]: I1201 10:35:48.205981 4909 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 01 10:35:48 crc kubenswrapper[4909]: I1201 10:35:48.233749 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 01 10:35:48 crc kubenswrapper[4909]: I1201 10:35:48.243403 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 01 10:35:48 crc kubenswrapper[4909]: I1201 10:35:48.380291 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 01 10:35:48 crc kubenswrapper[4909]: I1201 10:35:48.386784 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 01 10:35:48 crc kubenswrapper[4909]: I1201 10:35:48.478420 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 01 10:35:48 crc kubenswrapper[4909]: I1201 10:35:48.493832 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 01 10:35:48 crc kubenswrapper[4909]: I1201 10:35:48.637893 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 10:35:48 crc kubenswrapper[4909]: I1201 10:35:48.691801 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 01 10:35:48 crc kubenswrapper[4909]: I1201 10:35:48.757463 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 01 10:35:48 crc kubenswrapper[4909]: I1201 10:35:48.758210 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 01 
10:35:48 crc kubenswrapper[4909]: I1201 10:35:48.835282 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 01 10:35:48 crc kubenswrapper[4909]: I1201 10:35:48.853460 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 01 10:35:48 crc kubenswrapper[4909]: I1201 10:35:48.875534 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 01 10:35:48 crc kubenswrapper[4909]: I1201 10:35:48.918849 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 01 10:35:48 crc kubenswrapper[4909]: I1201 10:35:48.968492 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 01 10:35:48 crc kubenswrapper[4909]: I1201 10:35:48.980411 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 01 10:35:49 crc kubenswrapper[4909]: I1201 10:35:49.117005 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 01 10:35:49 crc kubenswrapper[4909]: I1201 10:35:49.170284 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 10:35:49 crc kubenswrapper[4909]: I1201 10:35:49.215553 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 01 10:35:49 crc kubenswrapper[4909]: I1201 10:35:49.245979 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 01 10:35:49 crc kubenswrapper[4909]: I1201 10:35:49.295462 4909 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 01 10:35:49 crc kubenswrapper[4909]: I1201 10:35:49.309064 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 01 10:35:49 crc kubenswrapper[4909]: I1201 10:35:49.332480 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 01 10:35:49 crc kubenswrapper[4909]: I1201 10:35:49.333051 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 01 10:35:49 crc kubenswrapper[4909]: I1201 10:35:49.614612 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 01 10:35:49 crc kubenswrapper[4909]: I1201 10:35:49.668522 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 01 10:35:49 crc kubenswrapper[4909]: I1201 10:35:49.670329 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 01 10:35:49 crc kubenswrapper[4909]: I1201 10:35:49.696605 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 01 10:35:49 crc kubenswrapper[4909]: I1201 10:35:49.726213 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 01 10:35:49 crc kubenswrapper[4909]: I1201 10:35:49.743596 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 01 10:35:49 crc kubenswrapper[4909]: I1201 10:35:49.778407 4909 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 01 10:35:49 crc kubenswrapper[4909]: I1201 10:35:49.846782 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 01 10:35:49 crc kubenswrapper[4909]: I1201 10:35:49.898487 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.046528 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.064854 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.136894 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.147939 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.162938 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.292682 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.296185 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.403243 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 
10:35:50.419704 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.628580 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.638309 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.642234 4909 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.650607 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.650703 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.658127 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.663555 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.669622 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.671646 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=17.67161744 podStartE2EDuration="17.67161744s" podCreationTimestamp="2025-12-01 10:35:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-12-01 10:35:50.669532132 +0000 UTC m=+267.904003070" watchObservedRunningTime="2025-12-01 10:35:50.67161744 +0000 UTC m=+267.906088358" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.717772 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.719840 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.754833 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.784548 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.785652 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.850693 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.865122 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.879275 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.883503 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.924943 4909 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"canary-serving-cert" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.964902 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.969893 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 01 10:35:50 crc kubenswrapper[4909]: I1201 10:35:50.991461 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 01 10:35:51 crc kubenswrapper[4909]: I1201 10:35:51.047688 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 01 10:35:51 crc kubenswrapper[4909]: I1201 10:35:51.073389 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 01 10:35:51 crc kubenswrapper[4909]: I1201 10:35:51.116432 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 01 10:35:51 crc kubenswrapper[4909]: I1201 10:35:51.121772 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 01 10:35:51 crc kubenswrapper[4909]: I1201 10:35:51.157593 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 01 10:35:51 crc kubenswrapper[4909]: I1201 10:35:51.237155 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 01 10:35:51 crc kubenswrapper[4909]: I1201 10:35:51.252929 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 01 10:35:51 crc kubenswrapper[4909]: I1201 10:35:51.292432 4909 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 01 10:35:51 crc kubenswrapper[4909]: I1201 10:35:51.293628 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 01 10:35:51 crc kubenswrapper[4909]: I1201 10:35:51.413713 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 01 10:35:51 crc kubenswrapper[4909]: I1201 10:35:51.413718 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 01 10:35:51 crc kubenswrapper[4909]: I1201 10:35:51.430653 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 01 10:35:51 crc kubenswrapper[4909]: I1201 10:35:51.450043 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 01 10:35:51 crc kubenswrapper[4909]: I1201 10:35:51.557455 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 01 10:35:51 crc kubenswrapper[4909]: I1201 10:35:51.571496 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 01 10:35:51 crc kubenswrapper[4909]: I1201 10:35:51.733875 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 01 10:35:52 crc kubenswrapper[4909]: I1201 10:35:52.002470 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 10:35:52 crc kubenswrapper[4909]: I1201 10:35:52.002483 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 01 10:35:52 crc kubenswrapper[4909]: 
I1201 10:35:52.026026 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 01 10:35:52 crc kubenswrapper[4909]: I1201 10:35:52.034242 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 01 10:35:52 crc kubenswrapper[4909]: I1201 10:35:52.169474 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 01 10:35:52 crc kubenswrapper[4909]: I1201 10:35:52.282150 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 01 10:35:52 crc kubenswrapper[4909]: I1201 10:35:52.298138 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 01 10:35:52 crc kubenswrapper[4909]: I1201 10:35:52.329652 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 01 10:35:52 crc kubenswrapper[4909]: I1201 10:35:52.332349 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 01 10:35:52 crc kubenswrapper[4909]: I1201 10:35:52.333524 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 01 10:35:52 crc kubenswrapper[4909]: I1201 10:35:52.382081 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 10:35:52 crc kubenswrapper[4909]: I1201 10:35:52.446803 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 01 10:35:52 crc kubenswrapper[4909]: I1201 10:35:52.560377 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 01 10:35:52 crc kubenswrapper[4909]: I1201 
10:35:52.603029 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 10:35:52 crc kubenswrapper[4909]: I1201 10:35:52.654440 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 01 10:35:52 crc kubenswrapper[4909]: I1201 10:35:52.770382 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 01 10:35:52 crc kubenswrapper[4909]: I1201 10:35:52.788208 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 01 10:35:52 crc kubenswrapper[4909]: I1201 10:35:52.790242 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 01 10:35:52 crc kubenswrapper[4909]: I1201 10:35:52.802447 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 10:35:52 crc kubenswrapper[4909]: I1201 10:35:52.855440 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 01 10:35:52 crc kubenswrapper[4909]: I1201 10:35:52.890212 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 01 10:35:52 crc kubenswrapper[4909]: I1201 10:35:52.953048 4909 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 01 10:35:53 crc kubenswrapper[4909]: I1201 10:35:53.036346 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 01 10:35:53 crc kubenswrapper[4909]: I1201 10:35:53.097254 4909 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 01 10:35:53 crc kubenswrapper[4909]: I1201 10:35:53.106093 4909 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 01 10:35:53 crc kubenswrapper[4909]: I1201 10:35:53.107621 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 01 10:35:53 crc kubenswrapper[4909]: I1201 10:35:53.223756 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 01 10:35:53 crc kubenswrapper[4909]: I1201 10:35:53.316698 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 01 10:35:53 crc kubenswrapper[4909]: I1201 10:35:53.390237 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 01 10:35:53 crc kubenswrapper[4909]: I1201 10:35:53.597195 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 01 10:35:53 crc kubenswrapper[4909]: I1201 10:35:53.635906 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 01 10:35:53 crc kubenswrapper[4909]: I1201 10:35:53.684690 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 01 10:35:53 crc kubenswrapper[4909]: I1201 10:35:53.760893 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 10:35:53 crc kubenswrapper[4909]: I1201 10:35:53.805732 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 01 10:35:53 crc kubenswrapper[4909]: I1201 
10:35:53.847258 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 01 10:35:54 crc kubenswrapper[4909]: I1201 10:35:54.041061 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 01 10:35:54 crc kubenswrapper[4909]: I1201 10:35:54.086260 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 01 10:35:54 crc kubenswrapper[4909]: I1201 10:35:54.132372 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 01 10:35:54 crc kubenswrapper[4909]: I1201 10:35:54.143848 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 01 10:35:54 crc kubenswrapper[4909]: I1201 10:35:54.166577 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 01 10:35:54 crc kubenswrapper[4909]: I1201 10:35:54.194255 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 01 10:35:54 crc kubenswrapper[4909]: I1201 10:35:54.337326 4909 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 01 10:35:54 crc kubenswrapper[4909]: I1201 10:35:54.357336 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 01 10:35:54 crc kubenswrapper[4909]: I1201 10:35:54.437749 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 10:35:54 crc kubenswrapper[4909]: I1201 10:35:54.437796 4909 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 01 10:35:54 crc kubenswrapper[4909]: I1201 10:35:54.544117 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 01 10:35:54 crc kubenswrapper[4909]: I1201 10:35:54.589779 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 10:35:54 crc kubenswrapper[4909]: I1201 10:35:54.619640 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 01 10:35:54 crc kubenswrapper[4909]: I1201 10:35:54.657209 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 01 10:35:54 crc kubenswrapper[4909]: I1201 10:35:54.692934 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 01 10:35:54 crc kubenswrapper[4909]: I1201 10:35:54.722891 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 01 10:35:54 crc kubenswrapper[4909]: I1201 10:35:54.908805 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.043684 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.085419 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.167529 4909 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.167618 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.167727 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.168861 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"79464f59dfa256499beea30fd07c026b5766ee7368c37ceedda194a19619c902"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.169157 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://79464f59dfa256499beea30fd07c026b5766ee7368c37ceedda194a19619c902" gracePeriod=30 Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.182419 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.194542 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.234767 4909 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.275414 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.335637 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.362215 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.405079 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.470078 4909 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.528413 4909 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.528754 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://e2b9105110540bc39f6f59be1ce595dee0cfa49a7bfb404739d0ca9b599379c7" gracePeriod=5 Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.810623 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.826440 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.861867 4909 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.896868 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.913654 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.948934 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 01 10:35:55 crc kubenswrapper[4909]: I1201 10:35:55.957798 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 01 10:35:56 crc kubenswrapper[4909]: I1201 10:35:56.032762 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 01 10:35:56 crc kubenswrapper[4909]: I1201 10:35:56.135291 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 01 10:35:56 crc kubenswrapper[4909]: I1201 10:35:56.148649 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 01 10:35:56 crc kubenswrapper[4909]: I1201 10:35:56.182462 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 01 10:35:56 crc kubenswrapper[4909]: I1201 10:35:56.242330 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 10:35:56 crc kubenswrapper[4909]: I1201 10:35:56.249392 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 01 
10:35:56 crc kubenswrapper[4909]: I1201 10:35:56.447289 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 01 10:35:56 crc kubenswrapper[4909]: I1201 10:35:56.481805 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 01 10:35:56 crc kubenswrapper[4909]: I1201 10:35:56.587333 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 01 10:35:56 crc kubenswrapper[4909]: I1201 10:35:56.661948 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 01 10:35:56 crc kubenswrapper[4909]: I1201 10:35:56.665240 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 01 10:35:56 crc kubenswrapper[4909]: I1201 10:35:56.701543 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 01 10:35:56 crc kubenswrapper[4909]: I1201 10:35:56.767120 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 01 10:35:56 crc kubenswrapper[4909]: I1201 10:35:56.791870 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 01 10:35:56 crc kubenswrapper[4909]: I1201 10:35:56.825091 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 10:35:56 crc kubenswrapper[4909]: I1201 10:35:56.899130 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 01 10:35:56 crc kubenswrapper[4909]: I1201 10:35:56.902618 4909 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"audit-1" Dec 01 10:35:56 crc kubenswrapper[4909]: I1201 10:35:56.962226 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 01 10:35:56 crc kubenswrapper[4909]: I1201 10:35:56.991179 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.075828 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.079074 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.119312 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vrjdh"] Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.119810 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vrjdh" podUID="87a9da3f-7bc4-42e3-ade7-b4728615a67b" containerName="registry-server" containerID="cri-o://1af2589a318e95561f70157a2d3a5202516a4ad8710d6a145c3be18d692ed614" gracePeriod=30 Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.128964 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2kbh7"] Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.129230 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2kbh7" podUID="13706ba3-7e21-4e1f-ac26-65e9f781d809" containerName="registry-server" containerID="cri-o://dfb49a5e2014ae6d25046c0c3cdcd5f4014df68e91cef2db9685ffd56840145a" gracePeriod=30 Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.144810 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-7cd6g"] Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.145161 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" podUID="ac0e24f9-2a35-4c96-b694-472eab5c4f15" containerName="marketplace-operator" containerID="cri-o://dc9e0b63004535ea8efd390db6ab59c3cd13b7dc2169145c58102485a011cd19" gracePeriod=30 Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.149700 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bp5bb"] Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.150023 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bp5bb" podUID="8949a76d-2e07-49ff-888a-0ca883b56996" containerName="registry-server" containerID="cri-o://6f24187acb3ed4986040c49c45a832a6bd65adcdccd83b4f759d4741cf8a7d3f" gracePeriod=30 Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.155003 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-865bh"] Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.156637 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-865bh" podUID="6b6f8e71-e11d-47d5-a8a2-c42e455f3f65" containerName="registry-server" containerID="cri-o://867813fd28542dc7c779d6286773dac2e3969e3599c65f731ce4548c2f64b20c" gracePeriod=30 Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.270129 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.319658 4909 generic.go:334] "Generic (PLEG): container finished" podID="87a9da3f-7bc4-42e3-ade7-b4728615a67b" containerID="1af2589a318e95561f70157a2d3a5202516a4ad8710d6a145c3be18d692ed614" exitCode=0 Dec 01 10:35:57 crc 
kubenswrapper[4909]: I1201 10:35:57.319941 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrjdh" event={"ID":"87a9da3f-7bc4-42e3-ade7-b4728615a67b","Type":"ContainerDied","Data":"1af2589a318e95561f70157a2d3a5202516a4ad8710d6a145c3be18d692ed614"} Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.324699 4909 generic.go:334] "Generic (PLEG): container finished" podID="13706ba3-7e21-4e1f-ac26-65e9f781d809" containerID="dfb49a5e2014ae6d25046c0c3cdcd5f4014df68e91cef2db9685ffd56840145a" exitCode=0 Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.324773 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2kbh7" event={"ID":"13706ba3-7e21-4e1f-ac26-65e9f781d809","Type":"ContainerDied","Data":"dfb49a5e2014ae6d25046c0c3cdcd5f4014df68e91cef2db9685ffd56840145a"} Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.327987 4909 generic.go:334] "Generic (PLEG): container finished" podID="8949a76d-2e07-49ff-888a-0ca883b56996" containerID="6f24187acb3ed4986040c49c45a832a6bd65adcdccd83b4f759d4741cf8a7d3f" exitCode=0 Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.328138 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp5bb" event={"ID":"8949a76d-2e07-49ff-888a-0ca883b56996","Type":"ContainerDied","Data":"6f24187acb3ed4986040c49c45a832a6bd65adcdccd83b4f759d4741cf8a7d3f"} Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.329771 4909 generic.go:334] "Generic (PLEG): container finished" podID="ac0e24f9-2a35-4c96-b694-472eab5c4f15" containerID="dc9e0b63004535ea8efd390db6ab59c3cd13b7dc2169145c58102485a011cd19" exitCode=0 Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.329844 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" 
event={"ID":"ac0e24f9-2a35-4c96-b694-472eab5c4f15","Type":"ContainerDied","Data":"dc9e0b63004535ea8efd390db6ab59c3cd13b7dc2169145c58102485a011cd19"} Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.337214 4909 generic.go:334] "Generic (PLEG): container finished" podID="6b6f8e71-e11d-47d5-a8a2-c42e455f3f65" containerID="867813fd28542dc7c779d6286773dac2e3969e3599c65f731ce4548c2f64b20c" exitCode=0 Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.337297 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-865bh" event={"ID":"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65","Type":"ContainerDied","Data":"867813fd28542dc7c779d6286773dac2e3969e3599c65f731ce4548c2f64b20c"} Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.347030 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.517822 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.579286 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vrjdh" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.627705 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.633870 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.676694 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.681205 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2kbh7" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.697044 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-865bh" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.700308 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bp5bb" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.723192 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xjr6\" (UniqueName: \"kubernetes.io/projected/87a9da3f-7bc4-42e3-ade7-b4728615a67b-kube-api-access-7xjr6\") pod \"87a9da3f-7bc4-42e3-ade7-b4728615a67b\" (UID: \"87a9da3f-7bc4-42e3-ade7-b4728615a67b\") " Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.723284 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87a9da3f-7bc4-42e3-ade7-b4728615a67b-catalog-content\") pod \"87a9da3f-7bc4-42e3-ade7-b4728615a67b\" (UID: \"87a9da3f-7bc4-42e3-ade7-b4728615a67b\") " Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.723322 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87a9da3f-7bc4-42e3-ade7-b4728615a67b-utilities\") pod \"87a9da3f-7bc4-42e3-ade7-b4728615a67b\" (UID: \"87a9da3f-7bc4-42e3-ade7-b4728615a67b\") " Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.724475 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87a9da3f-7bc4-42e3-ade7-b4728615a67b-utilities" (OuterVolumeSpecName: "utilities") pod "87a9da3f-7bc4-42e3-ade7-b4728615a67b" (UID: "87a9da3f-7bc4-42e3-ade7-b4728615a67b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.728478 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87a9da3f-7bc4-42e3-ade7-b4728615a67b-kube-api-access-7xjr6" (OuterVolumeSpecName: "kube-api-access-7xjr6") pod "87a9da3f-7bc4-42e3-ade7-b4728615a67b" (UID: "87a9da3f-7bc4-42e3-ade7-b4728615a67b"). InnerVolumeSpecName "kube-api-access-7xjr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.780917 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87a9da3f-7bc4-42e3-ade7-b4728615a67b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "87a9da3f-7bc4-42e3-ade7-b4728615a67b" (UID: "87a9da3f-7bc4-42e3-ade7-b4728615a67b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.781471 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.834187 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13706ba3-7e21-4e1f-ac26-65e9f781d809-catalog-content\") pod \"13706ba3-7e21-4e1f-ac26-65e9f781d809\" (UID: \"13706ba3-7e21-4e1f-ac26-65e9f781d809\") " Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.834330 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac0e24f9-2a35-4c96-b694-472eab5c4f15-marketplace-trusted-ca\") pod \"ac0e24f9-2a35-4c96-b694-472eab5c4f15\" (UID: \"ac0e24f9-2a35-4c96-b694-472eab5c4f15\") " Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.834371 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-hrz6s\" (UniqueName: \"kubernetes.io/projected/6b6f8e71-e11d-47d5-a8a2-c42e455f3f65-kube-api-access-hrz6s\") pod \"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65\" (UID: \"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65\") " Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.834409 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb8lv\" (UniqueName: \"kubernetes.io/projected/13706ba3-7e21-4e1f-ac26-65e9f781d809-kube-api-access-vb8lv\") pod \"13706ba3-7e21-4e1f-ac26-65e9f781d809\" (UID: \"13706ba3-7e21-4e1f-ac26-65e9f781d809\") " Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.834438 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8949a76d-2e07-49ff-888a-0ca883b56996-catalog-content\") pod \"8949a76d-2e07-49ff-888a-0ca883b56996\" (UID: \"8949a76d-2e07-49ff-888a-0ca883b56996\") " Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.834480 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8949a76d-2e07-49ff-888a-0ca883b56996-utilities\") pod \"8949a76d-2e07-49ff-888a-0ca883b56996\" (UID: \"8949a76d-2e07-49ff-888a-0ca883b56996\") " Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.834535 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g44pg\" (UniqueName: \"kubernetes.io/projected/8949a76d-2e07-49ff-888a-0ca883b56996-kube-api-access-g44pg\") pod \"8949a76d-2e07-49ff-888a-0ca883b56996\" (UID: \"8949a76d-2e07-49ff-888a-0ca883b56996\") " Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.834562 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b6f8e71-e11d-47d5-a8a2-c42e455f3f65-utilities\") pod 
\"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65\" (UID: \"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65\") " Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.834591 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13706ba3-7e21-4e1f-ac26-65e9f781d809-utilities\") pod \"13706ba3-7e21-4e1f-ac26-65e9f781d809\" (UID: \"13706ba3-7e21-4e1f-ac26-65e9f781d809\") " Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.834624 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ac0e24f9-2a35-4c96-b694-472eab5c4f15-marketplace-operator-metrics\") pod \"ac0e24f9-2a35-4c96-b694-472eab5c4f15\" (UID: \"ac0e24f9-2a35-4c96-b694-472eab5c4f15\") " Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.834669 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25zlp\" (UniqueName: \"kubernetes.io/projected/ac0e24f9-2a35-4c96-b694-472eab5c4f15-kube-api-access-25zlp\") pod \"ac0e24f9-2a35-4c96-b694-472eab5c4f15\" (UID: \"ac0e24f9-2a35-4c96-b694-472eab5c4f15\") " Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.834690 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b6f8e71-e11d-47d5-a8a2-c42e455f3f65-catalog-content\") pod \"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65\" (UID: \"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65\") " Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.834990 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87a9da3f-7bc4-42e3-ade7-b4728615a67b-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.835015 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xjr6\" (UniqueName: 
\"kubernetes.io/projected/87a9da3f-7bc4-42e3-ade7-b4728615a67b-kube-api-access-7xjr6\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.835031 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87a9da3f-7bc4-42e3-ade7-b4728615a67b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.840986 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8949a76d-2e07-49ff-888a-0ca883b56996-utilities" (OuterVolumeSpecName: "utilities") pod "8949a76d-2e07-49ff-888a-0ca883b56996" (UID: "8949a76d-2e07-49ff-888a-0ca883b56996"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.841069 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13706ba3-7e21-4e1f-ac26-65e9f781d809-utilities" (OuterVolumeSpecName: "utilities") pod "13706ba3-7e21-4e1f-ac26-65e9f781d809" (UID: "13706ba3-7e21-4e1f-ac26-65e9f781d809"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.842446 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac0e24f9-2a35-4c96-b694-472eab5c4f15-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "ac0e24f9-2a35-4c96-b694-472eab5c4f15" (UID: "ac0e24f9-2a35-4c96-b694-472eab5c4f15"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.843139 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b6f8e71-e11d-47d5-a8a2-c42e455f3f65-utilities" (OuterVolumeSpecName: "utilities") pod "6b6f8e71-e11d-47d5-a8a2-c42e455f3f65" (UID: "6b6f8e71-e11d-47d5-a8a2-c42e455f3f65"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.844961 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13706ba3-7e21-4e1f-ac26-65e9f781d809-kube-api-access-vb8lv" (OuterVolumeSpecName: "kube-api-access-vb8lv") pod "13706ba3-7e21-4e1f-ac26-65e9f781d809" (UID: "13706ba3-7e21-4e1f-ac26-65e9f781d809"). InnerVolumeSpecName "kube-api-access-vb8lv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.845126 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8949a76d-2e07-49ff-888a-0ca883b56996-kube-api-access-g44pg" (OuterVolumeSpecName: "kube-api-access-g44pg") pod "8949a76d-2e07-49ff-888a-0ca883b56996" (UID: "8949a76d-2e07-49ff-888a-0ca883b56996"). InnerVolumeSpecName "kube-api-access-g44pg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.845496 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac0e24f9-2a35-4c96-b694-472eab5c4f15-kube-api-access-25zlp" (OuterVolumeSpecName: "kube-api-access-25zlp") pod "ac0e24f9-2a35-4c96-b694-472eab5c4f15" (UID: "ac0e24f9-2a35-4c96-b694-472eab5c4f15"). InnerVolumeSpecName "kube-api-access-25zlp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.845860 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac0e24f9-2a35-4c96-b694-472eab5c4f15-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "ac0e24f9-2a35-4c96-b694-472eab5c4f15" (UID: "ac0e24f9-2a35-4c96-b694-472eab5c4f15"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.847106 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b6f8e71-e11d-47d5-a8a2-c42e455f3f65-kube-api-access-hrz6s" (OuterVolumeSpecName: "kube-api-access-hrz6s") pod "6b6f8e71-e11d-47d5-a8a2-c42e455f3f65" (UID: "6b6f8e71-e11d-47d5-a8a2-c42e455f3f65"). InnerVolumeSpecName "kube-api-access-hrz6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.860996 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8949a76d-2e07-49ff-888a-0ca883b56996-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8949a76d-2e07-49ff-888a-0ca883b56996" (UID: "8949a76d-2e07-49ff-888a-0ca883b56996"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.909587 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13706ba3-7e21-4e1f-ac26-65e9f781d809-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13706ba3-7e21-4e1f-ac26-65e9f781d809" (UID: "13706ba3-7e21-4e1f-ac26-65e9f781d809"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.937107 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13706ba3-7e21-4e1f-ac26-65e9f781d809-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.937544 4909 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac0e24f9-2a35-4c96-b694-472eab5c4f15-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.937559 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrz6s\" (UniqueName: \"kubernetes.io/projected/6b6f8e71-e11d-47d5-a8a2-c42e455f3f65-kube-api-access-hrz6s\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.937572 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8949a76d-2e07-49ff-888a-0ca883b56996-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.937584 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb8lv\" (UniqueName: \"kubernetes.io/projected/13706ba3-7e21-4e1f-ac26-65e9f781d809-kube-api-access-vb8lv\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.937595 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8949a76d-2e07-49ff-888a-0ca883b56996-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.937607 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g44pg\" (UniqueName: \"kubernetes.io/projected/8949a76d-2e07-49ff-888a-0ca883b56996-kube-api-access-g44pg\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:57 
crc kubenswrapper[4909]: I1201 10:35:57.937618 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b6f8e71-e11d-47d5-a8a2-c42e455f3f65-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.937629 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13706ba3-7e21-4e1f-ac26-65e9f781d809-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.937638 4909 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ac0e24f9-2a35-4c96-b694-472eab5c4f15-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.937649 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25zlp\" (UniqueName: \"kubernetes.io/projected/ac0e24f9-2a35-4c96-b694-472eab5c4f15-kube-api-access-25zlp\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:57 crc kubenswrapper[4909]: I1201 10:35:57.948381 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b6f8e71-e11d-47d5-a8a2-c42e455f3f65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b6f8e71-e11d-47d5-a8a2-c42e455f3f65" (UID: "6b6f8e71-e11d-47d5-a8a2-c42e455f3f65"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.038922 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b6f8e71-e11d-47d5-a8a2-c42e455f3f65-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.119329 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.159765 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.331188 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.344531 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.350031 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-865bh" event={"ID":"6b6f8e71-e11d-47d5-a8a2-c42e455f3f65","Type":"ContainerDied","Data":"230be9764671fab59f0269309a2f45293a973d49f740ec2f2c6384ae79b5e525"} Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.350094 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-865bh" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.350103 4909 scope.go:117] "RemoveContainer" containerID="867813fd28542dc7c779d6286773dac2e3969e3599c65f731ce4548c2f64b20c" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.351934 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrjdh" event={"ID":"87a9da3f-7bc4-42e3-ade7-b4728615a67b","Type":"ContainerDied","Data":"64ec9807945c76fdf082c0bc405c4533733556fdf8561ec020faae6ac6d14da3"} Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.352138 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vrjdh" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.357744 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2kbh7" event={"ID":"13706ba3-7e21-4e1f-ac26-65e9f781d809","Type":"ContainerDied","Data":"d3547de5ac2ba26b2ea1e9a829643d3263482d34525b56fe94a2177d4d63201b"} Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.357761 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2kbh7" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.362775 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp5bb" event={"ID":"8949a76d-2e07-49ff-888a-0ca883b56996","Type":"ContainerDied","Data":"31b4a882eaabc0f6d05bb7788194452ae234f2fa96ac792bc5b626dc26973ad1"} Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.363097 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bp5bb" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.366854 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" event={"ID":"ac0e24f9-2a35-4c96-b694-472eab5c4f15","Type":"ContainerDied","Data":"04510736f49555a97a838ac21ad54448dcaf67b28ad520bd569e391cf80f8c91"} Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.367075 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7cd6g" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.377474 4909 scope.go:117] "RemoveContainer" containerID="3c535fb969561ef6766ab6e6272e05ea6e7d4be37c33557cea159b6408cbccc3" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.379555 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.384782 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.394813 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vrjdh"] Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.397087 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vrjdh"] Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.416020 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.417452 4909 scope.go:117] "RemoveContainer" containerID="ce53868867776b25aa665913ff7c7c2ed3819b541622234d73160d98b3a0f739" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.434057 4909 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/community-operators-2kbh7"] Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.434698 4909 scope.go:117] "RemoveContainer" containerID="1af2589a318e95561f70157a2d3a5202516a4ad8710d6a145c3be18d692ed614" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.442762 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2kbh7"] Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.457377 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-865bh"] Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.464032 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-865bh"] Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.464375 4909 scope.go:117] "RemoveContainer" containerID="aec14c1b01fb03dacfbc18d7de403c3967e48a21a5ea46b9ca748facc5d5dbdf" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.471716 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bp5bb"] Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.474861 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bp5bb"] Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.478312 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7cd6g"] Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.483615 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7cd6g"] Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.484588 4909 scope.go:117] "RemoveContainer" containerID="a56ee55a6b289a6e1b43296eb2b7e27ab8c8ba7629814bbdb9c998fd99923da2" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.499779 4909 scope.go:117] "RemoveContainer" 
containerID="dfb49a5e2014ae6d25046c0c3cdcd5f4014df68e91cef2db9685ffd56840145a" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.521159 4909 scope.go:117] "RemoveContainer" containerID="e29a43058885ac2d3770345b75d1136be54485ef4c077e6926d4236e598df99c" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.537590 4909 scope.go:117] "RemoveContainer" containerID="589c50e36ad19ed34c2680336b679d5251441acf337c2081c99a15de5e632233" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.555362 4909 scope.go:117] "RemoveContainer" containerID="6f24187acb3ed4986040c49c45a832a6bd65adcdccd83b4f759d4741cf8a7d3f" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.573827 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.574134 4909 scope.go:117] "RemoveContainer" containerID="f4573ad3c94a9613a717226e7b597ef72b32432147e51f02a006281cd20fbd50" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.588665 4909 scope.go:117] "RemoveContainer" containerID="60d56e63d7752827222b023745357166e9aaf6e00e594da26dd5962459acac97" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.601997 4909 scope.go:117] "RemoveContainer" containerID="dc9e0b63004535ea8efd390db6ab59c3cd13b7dc2169145c58102485a011cd19" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.673676 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.732386 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.824636 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 01 10:35:58 crc kubenswrapper[4909]: 
I1201 10:35:58.840709 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 01 10:35:58 crc kubenswrapper[4909]: I1201 10:35:58.918637 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 01 10:35:59 crc kubenswrapper[4909]: I1201 10:35:59.047226 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 10:35:59 crc kubenswrapper[4909]: I1201 10:35:59.050772 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 01 10:35:59 crc kubenswrapper[4909]: I1201 10:35:59.066946 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 01 10:35:59 crc kubenswrapper[4909]: I1201 10:35:59.067754 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 10:35:59 crc kubenswrapper[4909]: I1201 10:35:59.265467 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13706ba3-7e21-4e1f-ac26-65e9f781d809" path="/var/lib/kubelet/pods/13706ba3-7e21-4e1f-ac26-65e9f781d809/volumes" Dec 01 10:35:59 crc kubenswrapper[4909]: I1201 10:35:59.266123 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b6f8e71-e11d-47d5-a8a2-c42e455f3f65" path="/var/lib/kubelet/pods/6b6f8e71-e11d-47d5-a8a2-c42e455f3f65/volumes" Dec 01 10:35:59 crc kubenswrapper[4909]: I1201 10:35:59.266676 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87a9da3f-7bc4-42e3-ade7-b4728615a67b" path="/var/lib/kubelet/pods/87a9da3f-7bc4-42e3-ade7-b4728615a67b/volumes" Dec 01 10:35:59 crc kubenswrapper[4909]: I1201 10:35:59.267657 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8949a76d-2e07-49ff-888a-0ca883b56996" path="/var/lib/kubelet/pods/8949a76d-2e07-49ff-888a-0ca883b56996/volumes" Dec 01 10:35:59 crc kubenswrapper[4909]: I1201 10:35:59.268249 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac0e24f9-2a35-4c96-b694-472eab5c4f15" path="/var/lib/kubelet/pods/ac0e24f9-2a35-4c96-b694-472eab5c4f15/volumes" Dec 01 10:35:59 crc kubenswrapper[4909]: I1201 10:35:59.372694 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 01 10:35:59 crc kubenswrapper[4909]: I1201 10:35:59.513422 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 01 10:35:59 crc kubenswrapper[4909]: I1201 10:35:59.540786 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 10:35:59 crc kubenswrapper[4909]: I1201 10:35:59.793975 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 01 10:35:59 crc kubenswrapper[4909]: I1201 10:35:59.867386 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 01 10:35:59 crc kubenswrapper[4909]: I1201 10:35:59.903715 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 01 10:36:00 crc kubenswrapper[4909]: I1201 10:36:00.269499 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 01 10:36:00 crc kubenswrapper[4909]: I1201 10:36:00.306639 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 01 10:36:00 crc kubenswrapper[4909]: I1201 10:36:00.531580 4909 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 01 10:36:00 crc kubenswrapper[4909]: I1201 10:36:00.732911 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 10:36:00 crc kubenswrapper[4909]: I1201 10:36:00.852393 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.100378 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.100486 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.190160 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.190371 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.291621 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.291730 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.291784 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.291766 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.291815 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.291855 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.291856 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.292098 4909 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.292114 4909 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.292125 4909 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.292136 4909 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.300938 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.388984 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.389049 4909 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="e2b9105110540bc39f6f59be1ce595dee0cfa49a7bfb404739d0ca9b599379c7" exitCode=137 Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.389100 4909 scope.go:117] "RemoveContainer" containerID="e2b9105110540bc39f6f59be1ce595dee0cfa49a7bfb404739d0ca9b599379c7" Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.389103 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.393373 4909 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.413673 4909 scope.go:117] "RemoveContainer" containerID="e2b9105110540bc39f6f59be1ce595dee0cfa49a7bfb404739d0ca9b599379c7" Dec 01 10:36:01 crc kubenswrapper[4909]: E1201 10:36:01.414943 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2b9105110540bc39f6f59be1ce595dee0cfa49a7bfb404739d0ca9b599379c7\": container with ID starting with e2b9105110540bc39f6f59be1ce595dee0cfa49a7bfb404739d0ca9b599379c7 not found: ID does not exist" containerID="e2b9105110540bc39f6f59be1ce595dee0cfa49a7bfb404739d0ca9b599379c7" Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.414988 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e2b9105110540bc39f6f59be1ce595dee0cfa49a7bfb404739d0ca9b599379c7"} err="failed to get container status \"e2b9105110540bc39f6f59be1ce595dee0cfa49a7bfb404739d0ca9b599379c7\": rpc error: code = NotFound desc = could not find container \"e2b9105110540bc39f6f59be1ce595dee0cfa49a7bfb404739d0ca9b599379c7\": container with ID starting with e2b9105110540bc39f6f59be1ce595dee0cfa49a7bfb404739d0ca9b599379c7 not found: ID does not exist" Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.478582 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 01 10:36:01 crc kubenswrapper[4909]: I1201 10:36:01.897269 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 01 10:36:03 crc kubenswrapper[4909]: I1201 10:36:03.263927 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 01 10:36:23 crc kubenswrapper[4909]: I1201 10:36:23.062631 4909 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 01 10:36:25 crc kubenswrapper[4909]: I1201 10:36:25.521801 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 01 10:36:25 crc kubenswrapper[4909]: I1201 10:36:25.524363 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 10:36:25 crc kubenswrapper[4909]: I1201 10:36:25.524489 4909 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="79464f59dfa256499beea30fd07c026b5766ee7368c37ceedda194a19619c902" exitCode=137 
Dec 01 10:36:25 crc kubenswrapper[4909]: I1201 10:36:25.524609 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"79464f59dfa256499beea30fd07c026b5766ee7368c37ceedda194a19619c902"} Dec 01 10:36:25 crc kubenswrapper[4909]: I1201 10:36:25.524701 4909 scope.go:117] "RemoveContainer" containerID="95a7c61a90ae01d6f0208975f8bf98edb6a8d1274d95c695ab902d5d8f20317a" Dec 01 10:36:26 crc kubenswrapper[4909]: I1201 10:36:26.534097 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 01 10:36:26 crc kubenswrapper[4909]: I1201 10:36:26.536615 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9ad6bee94cb1560b3d381c9a0a373db23d949e0ed39545f9f21c1712d6850606"} Dec 01 10:36:29 crc kubenswrapper[4909]: I1201 10:36:29.401800 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.154614 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z5f7f"] Dec 01 10:36:31 crc kubenswrapper[4909]: E1201 10:36:31.154967 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8949a76d-2e07-49ff-888a-0ca883b56996" containerName="extract-utilities" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.154988 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8949a76d-2e07-49ff-888a-0ca883b56996" containerName="extract-utilities" Dec 01 10:36:31 crc kubenswrapper[4909]: E1201 10:36:31.155010 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="87a9da3f-7bc4-42e3-ade7-b4728615a67b" containerName="extract-utilities" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.155019 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="87a9da3f-7bc4-42e3-ade7-b4728615a67b" containerName="extract-utilities" Dec 01 10:36:31 crc kubenswrapper[4909]: E1201 10:36:31.155028 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac0e24f9-2a35-4c96-b694-472eab5c4f15" containerName="marketplace-operator" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.155039 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac0e24f9-2a35-4c96-b694-472eab5c4f15" containerName="marketplace-operator" Dec 01 10:36:31 crc kubenswrapper[4909]: E1201 10:36:31.155051 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b6f8e71-e11d-47d5-a8a2-c42e455f3f65" containerName="extract-utilities" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.155062 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6f8e71-e11d-47d5-a8a2-c42e455f3f65" containerName="extract-utilities" Dec 01 10:36:31 crc kubenswrapper[4909]: E1201 10:36:31.155074 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13706ba3-7e21-4e1f-ac26-65e9f781d809" containerName="registry-server" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.155081 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="13706ba3-7e21-4e1f-ac26-65e9f781d809" containerName="registry-server" Dec 01 10:36:31 crc kubenswrapper[4909]: E1201 10:36:31.155095 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b6f8e71-e11d-47d5-a8a2-c42e455f3f65" containerName="registry-server" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.155102 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6f8e71-e11d-47d5-a8a2-c42e455f3f65" containerName="registry-server" Dec 01 10:36:31 crc kubenswrapper[4909]: E1201 10:36:31.155116 4909 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="87a9da3f-7bc4-42e3-ade7-b4728615a67b" containerName="registry-server" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.155122 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="87a9da3f-7bc4-42e3-ade7-b4728615a67b" containerName="registry-server" Dec 01 10:36:31 crc kubenswrapper[4909]: E1201 10:36:31.155130 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13706ba3-7e21-4e1f-ac26-65e9f781d809" containerName="extract-utilities" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.155137 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="13706ba3-7e21-4e1f-ac26-65e9f781d809" containerName="extract-utilities" Dec 01 10:36:31 crc kubenswrapper[4909]: E1201 10:36:31.155147 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8949a76d-2e07-49ff-888a-0ca883b56996" containerName="extract-content" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.155154 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8949a76d-2e07-49ff-888a-0ca883b56996" containerName="extract-content" Dec 01 10:36:31 crc kubenswrapper[4909]: E1201 10:36:31.155166 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13706ba3-7e21-4e1f-ac26-65e9f781d809" containerName="extract-content" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.155174 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="13706ba3-7e21-4e1f-ac26-65e9f781d809" containerName="extract-content" Dec 01 10:36:31 crc kubenswrapper[4909]: E1201 10:36:31.155185 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8949a76d-2e07-49ff-888a-0ca883b56996" containerName="registry-server" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.155379 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8949a76d-2e07-49ff-888a-0ca883b56996" containerName="registry-server" Dec 01 10:36:31 crc kubenswrapper[4909]: E1201 10:36:31.155392 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6b6f8e71-e11d-47d5-a8a2-c42e455f3f65" containerName="extract-content" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.155399 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6f8e71-e11d-47d5-a8a2-c42e455f3f65" containerName="extract-content" Dec 01 10:36:31 crc kubenswrapper[4909]: E1201 10:36:31.155410 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.155418 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 10:36:31 crc kubenswrapper[4909]: E1201 10:36:31.155430 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87a9da3f-7bc4-42e3-ade7-b4728615a67b" containerName="extract-content" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.155437 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="87a9da3f-7bc4-42e3-ade7-b4728615a67b" containerName="extract-content" Dec 01 10:36:31 crc kubenswrapper[4909]: E1201 10:36:31.155447 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b055ce-6813-4bda-80c6-e71788d05982" containerName="installer" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.155455 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b055ce-6813-4bda-80c6-e71788d05982" containerName="installer" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.155593 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b055ce-6813-4bda-80c6-e71788d05982" containerName="installer" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.155613 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="87a9da3f-7bc4-42e3-ade7-b4728615a67b" containerName="registry-server" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.155624 4909 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6b6f8e71-e11d-47d5-a8a2-c42e455f3f65" containerName="registry-server" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.155635 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac0e24f9-2a35-4c96-b694-472eab5c4f15" containerName="marketplace-operator" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.155648 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="13706ba3-7e21-4e1f-ac26-65e9f781d809" containerName="registry-server" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.155661 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8949a76d-2e07-49ff-888a-0ca883b56996" containerName="registry-server" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.155673 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.156647 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z5f7f" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.160820 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.160846 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.160865 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.170577 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z5f7f"] Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.262091 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2345b837-4d7d-4d0a-b834-cb26782887ed-utilities\") pod \"redhat-operators-z5f7f\" (UID: \"2345b837-4d7d-4d0a-b834-cb26782887ed\") " pod="openshift-marketplace/redhat-operators-z5f7f" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.262275 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2345b837-4d7d-4d0a-b834-cb26782887ed-catalog-content\") pod \"redhat-operators-z5f7f\" (UID: \"2345b837-4d7d-4d0a-b834-cb26782887ed\") " pod="openshift-marketplace/redhat-operators-z5f7f" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.262308 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2xzg\" (UniqueName: \"kubernetes.io/projected/2345b837-4d7d-4d0a-b834-cb26782887ed-kube-api-access-f2xzg\") pod \"redhat-operators-z5f7f\" (UID: \"2345b837-4d7d-4d0a-b834-cb26782887ed\") " 
pod="openshift-marketplace/redhat-operators-z5f7f" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.363398 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2345b837-4d7d-4d0a-b834-cb26782887ed-utilities\") pod \"redhat-operators-z5f7f\" (UID: \"2345b837-4d7d-4d0a-b834-cb26782887ed\") " pod="openshift-marketplace/redhat-operators-z5f7f" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.363527 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2345b837-4d7d-4d0a-b834-cb26782887ed-catalog-content\") pod \"redhat-operators-z5f7f\" (UID: \"2345b837-4d7d-4d0a-b834-cb26782887ed\") " pod="openshift-marketplace/redhat-operators-z5f7f" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.363574 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2xzg\" (UniqueName: \"kubernetes.io/projected/2345b837-4d7d-4d0a-b834-cb26782887ed-kube-api-access-f2xzg\") pod \"redhat-operators-z5f7f\" (UID: \"2345b837-4d7d-4d0a-b834-cb26782887ed\") " pod="openshift-marketplace/redhat-operators-z5f7f" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.363975 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2345b837-4d7d-4d0a-b834-cb26782887ed-utilities\") pod \"redhat-operators-z5f7f\" (UID: \"2345b837-4d7d-4d0a-b834-cb26782887ed\") " pod="openshift-marketplace/redhat-operators-z5f7f" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.363984 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2345b837-4d7d-4d0a-b834-cb26782887ed-catalog-content\") pod \"redhat-operators-z5f7f\" (UID: \"2345b837-4d7d-4d0a-b834-cb26782887ed\") " pod="openshift-marketplace/redhat-operators-z5f7f" Dec 01 10:36:31 crc 
kubenswrapper[4909]: I1201 10:36:31.384848 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2xzg\" (UniqueName: \"kubernetes.io/projected/2345b837-4d7d-4d0a-b834-cb26782887ed-kube-api-access-f2xzg\") pod \"redhat-operators-z5f7f\" (UID: \"2345b837-4d7d-4d0a-b834-cb26782887ed\") " pod="openshift-marketplace/redhat-operators-z5f7f" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.483617 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z5f7f" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.549409 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ffblz"] Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.553383 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ffblz" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.559187 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.567953 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ffblz"] Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.568403 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d64a6cb-82ae-49da-9518-72b6727de254-utilities\") pod \"certified-operators-ffblz\" (UID: \"2d64a6cb-82ae-49da-9518-72b6727de254\") " pod="openshift-marketplace/certified-operators-ffblz" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.568461 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d64a6cb-82ae-49da-9518-72b6727de254-catalog-content\") pod 
\"certified-operators-ffblz\" (UID: \"2d64a6cb-82ae-49da-9518-72b6727de254\") " pod="openshift-marketplace/certified-operators-ffblz" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.568507 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tzxx\" (UniqueName: \"kubernetes.io/projected/2d64a6cb-82ae-49da-9518-72b6727de254-kube-api-access-7tzxx\") pod \"certified-operators-ffblz\" (UID: \"2d64a6cb-82ae-49da-9518-72b6727de254\") " pod="openshift-marketplace/certified-operators-ffblz" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.670122 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d64a6cb-82ae-49da-9518-72b6727de254-utilities\") pod \"certified-operators-ffblz\" (UID: \"2d64a6cb-82ae-49da-9518-72b6727de254\") " pod="openshift-marketplace/certified-operators-ffblz" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.670216 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d64a6cb-82ae-49da-9518-72b6727de254-catalog-content\") pod \"certified-operators-ffblz\" (UID: \"2d64a6cb-82ae-49da-9518-72b6727de254\") " pod="openshift-marketplace/certified-operators-ffblz" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.670281 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tzxx\" (UniqueName: \"kubernetes.io/projected/2d64a6cb-82ae-49da-9518-72b6727de254-kube-api-access-7tzxx\") pod \"certified-operators-ffblz\" (UID: \"2d64a6cb-82ae-49da-9518-72b6727de254\") " pod="openshift-marketplace/certified-operators-ffblz" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.671554 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2d64a6cb-82ae-49da-9518-72b6727de254-catalog-content\") pod \"certified-operators-ffblz\" (UID: \"2d64a6cb-82ae-49da-9518-72b6727de254\") " pod="openshift-marketplace/certified-operators-ffblz" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.671576 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d64a6cb-82ae-49da-9518-72b6727de254-utilities\") pod \"certified-operators-ffblz\" (UID: \"2d64a6cb-82ae-49da-9518-72b6727de254\") " pod="openshift-marketplace/certified-operators-ffblz" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.687945 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tzxx\" (UniqueName: \"kubernetes.io/projected/2d64a6cb-82ae-49da-9518-72b6727de254-kube-api-access-7tzxx\") pod \"certified-operators-ffblz\" (UID: \"2d64a6cb-82ae-49da-9518-72b6727de254\") " pod="openshift-marketplace/certified-operators-ffblz" Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.900520 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z5f7f"] Dec 01 10:36:31 crc kubenswrapper[4909]: I1201 10:36:31.909390 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ffblz" Dec 01 10:36:31 crc kubenswrapper[4909]: W1201 10:36:31.911014 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2345b837_4d7d_4d0a_b834_cb26782887ed.slice/crio-f49e6be80f4312219fbe0ce2f69b1da86410acd2d4cf1801c4687b1b3e4d63c7 WatchSource:0}: Error finding container f49e6be80f4312219fbe0ce2f69b1da86410acd2d4cf1801c4687b1b3e4d63c7: Status 404 returned error can't find the container with id f49e6be80f4312219fbe0ce2f69b1da86410acd2d4cf1801c4687b1b3e4d63c7 Dec 01 10:36:32 crc kubenswrapper[4909]: I1201 10:36:32.179736 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ffblz"] Dec 01 10:36:32 crc kubenswrapper[4909]: W1201 10:36:32.184754 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d64a6cb_82ae_49da_9518_72b6727de254.slice/crio-5709caac140b354179cbb31b3f907553050f1e45ca12125c818aefa164a9a43a WatchSource:0}: Error finding container 5709caac140b354179cbb31b3f907553050f1e45ca12125c818aefa164a9a43a: Status 404 returned error can't find the container with id 5709caac140b354179cbb31b3f907553050f1e45ca12125c818aefa164a9a43a Dec 01 10:36:32 crc kubenswrapper[4909]: I1201 10:36:32.579182 4909 generic.go:334] "Generic (PLEG): container finished" podID="2345b837-4d7d-4d0a-b834-cb26782887ed" containerID="ad57e0e84a5c85d965eb8bafc30362509db4ddc97c5b3d6dba7a0a3b24221d6d" exitCode=0 Dec 01 10:36:32 crc kubenswrapper[4909]: I1201 10:36:32.579272 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z5f7f" event={"ID":"2345b837-4d7d-4d0a-b834-cb26782887ed","Type":"ContainerDied","Data":"ad57e0e84a5c85d965eb8bafc30362509db4ddc97c5b3d6dba7a0a3b24221d6d"} Dec 01 10:36:32 crc kubenswrapper[4909]: I1201 10:36:32.579304 4909 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z5f7f" event={"ID":"2345b837-4d7d-4d0a-b834-cb26782887ed","Type":"ContainerStarted","Data":"f49e6be80f4312219fbe0ce2f69b1da86410acd2d4cf1801c4687b1b3e4d63c7"} Dec 01 10:36:32 crc kubenswrapper[4909]: I1201 10:36:32.581192 4909 generic.go:334] "Generic (PLEG): container finished" podID="2d64a6cb-82ae-49da-9518-72b6727de254" containerID="d84f95a52fc2ed81b50b9b4764e06c2b649dfdc11482b2a91961359f42792060" exitCode=0 Dec 01 10:36:32 crc kubenswrapper[4909]: I1201 10:36:32.581233 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ffblz" event={"ID":"2d64a6cb-82ae-49da-9518-72b6727de254","Type":"ContainerDied","Data":"d84f95a52fc2ed81b50b9b4764e06c2b649dfdc11482b2a91961359f42792060"} Dec 01 10:36:32 crc kubenswrapper[4909]: I1201 10:36:32.581261 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ffblz" event={"ID":"2d64a6cb-82ae-49da-9518-72b6727de254","Type":"ContainerStarted","Data":"5709caac140b354179cbb31b3f907553050f1e45ca12125c818aefa164a9a43a"} Dec 01 10:36:33 crc kubenswrapper[4909]: I1201 10:36:33.158146 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-knj4t"] Dec 01 10:36:33 crc kubenswrapper[4909]: I1201 10:36:33.159799 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-knj4t" Dec 01 10:36:33 crc kubenswrapper[4909]: I1201 10:36:33.164645 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 10:36:33 crc kubenswrapper[4909]: I1201 10:36:33.171826 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-knj4t"] Dec 01 10:36:33 crc kubenswrapper[4909]: I1201 10:36:33.202662 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10aa799-1736-49e3-a39d-d0a61dfbf0c4-utilities\") pod \"community-operators-knj4t\" (UID: \"f10aa799-1736-49e3-a39d-d0a61dfbf0c4\") " pod="openshift-marketplace/community-operators-knj4t" Dec 01 10:36:33 crc kubenswrapper[4909]: I1201 10:36:33.202780 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgqhf\" (UniqueName: \"kubernetes.io/projected/f10aa799-1736-49e3-a39d-d0a61dfbf0c4-kube-api-access-sgqhf\") pod \"community-operators-knj4t\" (UID: \"f10aa799-1736-49e3-a39d-d0a61dfbf0c4\") " pod="openshift-marketplace/community-operators-knj4t" Dec 01 10:36:33 crc kubenswrapper[4909]: I1201 10:36:33.202834 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10aa799-1736-49e3-a39d-d0a61dfbf0c4-catalog-content\") pod \"community-operators-knj4t\" (UID: \"f10aa799-1736-49e3-a39d-d0a61dfbf0c4\") " pod="openshift-marketplace/community-operators-knj4t" Dec 01 10:36:33 crc kubenswrapper[4909]: I1201 10:36:33.306668 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10aa799-1736-49e3-a39d-d0a61dfbf0c4-catalog-content\") pod \"community-operators-knj4t\" (UID: 
\"f10aa799-1736-49e3-a39d-d0a61dfbf0c4\") " pod="openshift-marketplace/community-operators-knj4t" Dec 01 10:36:33 crc kubenswrapper[4909]: I1201 10:36:33.306796 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10aa799-1736-49e3-a39d-d0a61dfbf0c4-utilities\") pod \"community-operators-knj4t\" (UID: \"f10aa799-1736-49e3-a39d-d0a61dfbf0c4\") " pod="openshift-marketplace/community-operators-knj4t" Dec 01 10:36:33 crc kubenswrapper[4909]: I1201 10:36:33.306948 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgqhf\" (UniqueName: \"kubernetes.io/projected/f10aa799-1736-49e3-a39d-d0a61dfbf0c4-kube-api-access-sgqhf\") pod \"community-operators-knj4t\" (UID: \"f10aa799-1736-49e3-a39d-d0a61dfbf0c4\") " pod="openshift-marketplace/community-operators-knj4t" Dec 01 10:36:33 crc kubenswrapper[4909]: I1201 10:36:33.307490 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10aa799-1736-49e3-a39d-d0a61dfbf0c4-catalog-content\") pod \"community-operators-knj4t\" (UID: \"f10aa799-1736-49e3-a39d-d0a61dfbf0c4\") " pod="openshift-marketplace/community-operators-knj4t" Dec 01 10:36:33 crc kubenswrapper[4909]: I1201 10:36:33.307565 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10aa799-1736-49e3-a39d-d0a61dfbf0c4-utilities\") pod \"community-operators-knj4t\" (UID: \"f10aa799-1736-49e3-a39d-d0a61dfbf0c4\") " pod="openshift-marketplace/community-operators-knj4t" Dec 01 10:36:33 crc kubenswrapper[4909]: I1201 10:36:33.333695 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgqhf\" (UniqueName: \"kubernetes.io/projected/f10aa799-1736-49e3-a39d-d0a61dfbf0c4-kube-api-access-sgqhf\") pod \"community-operators-knj4t\" (UID: 
\"f10aa799-1736-49e3-a39d-d0a61dfbf0c4\") " pod="openshift-marketplace/community-operators-knj4t" Dec 01 10:36:33 crc kubenswrapper[4909]: I1201 10:36:33.478404 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-knj4t" Dec 01 10:36:33 crc kubenswrapper[4909]: I1201 10:36:33.686619 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-knj4t"] Dec 01 10:36:34 crc kubenswrapper[4909]: I1201 10:36:34.347812 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cbcqx"] Dec 01 10:36:34 crc kubenswrapper[4909]: I1201 10:36:34.348985 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbcqx" Dec 01 10:36:34 crc kubenswrapper[4909]: I1201 10:36:34.351548 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 10:36:34 crc kubenswrapper[4909]: I1201 10:36:34.361199 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbcqx"] Dec 01 10:36:34 crc kubenswrapper[4909]: I1201 10:36:34.521470 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd102738-71ee-4009-8101-f645325f2de7-utilities\") pod \"redhat-marketplace-cbcqx\" (UID: \"dd102738-71ee-4009-8101-f645325f2de7\") " pod="openshift-marketplace/redhat-marketplace-cbcqx" Dec 01 10:36:34 crc kubenswrapper[4909]: I1201 10:36:34.521563 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd102738-71ee-4009-8101-f645325f2de7-catalog-content\") pod \"redhat-marketplace-cbcqx\" (UID: \"dd102738-71ee-4009-8101-f645325f2de7\") " 
pod="openshift-marketplace/redhat-marketplace-cbcqx" Dec 01 10:36:34 crc kubenswrapper[4909]: I1201 10:36:34.521603 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrzfw\" (UniqueName: \"kubernetes.io/projected/dd102738-71ee-4009-8101-f645325f2de7-kube-api-access-mrzfw\") pod \"redhat-marketplace-cbcqx\" (UID: \"dd102738-71ee-4009-8101-f645325f2de7\") " pod="openshift-marketplace/redhat-marketplace-cbcqx" Dec 01 10:36:34 crc kubenswrapper[4909]: I1201 10:36:34.597464 4909 generic.go:334] "Generic (PLEG): container finished" podID="2345b837-4d7d-4d0a-b834-cb26782887ed" containerID="bab7a2e0ccdea4c4b6752456a916bc7c2d423066f3ba2ca935f0e72ab4ef6df2" exitCode=0 Dec 01 10:36:34 crc kubenswrapper[4909]: I1201 10:36:34.597545 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z5f7f" event={"ID":"2345b837-4d7d-4d0a-b834-cb26782887ed","Type":"ContainerDied","Data":"bab7a2e0ccdea4c4b6752456a916bc7c2d423066f3ba2ca935f0e72ab4ef6df2"} Dec 01 10:36:34 crc kubenswrapper[4909]: I1201 10:36:34.598677 4909 generic.go:334] "Generic (PLEG): container finished" podID="f10aa799-1736-49e3-a39d-d0a61dfbf0c4" containerID="abe618aaf94bc7921b48220d2bdc98815bb3e9e874eed1eafa18ad212add54a3" exitCode=0 Dec 01 10:36:34 crc kubenswrapper[4909]: I1201 10:36:34.598728 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knj4t" event={"ID":"f10aa799-1736-49e3-a39d-d0a61dfbf0c4","Type":"ContainerDied","Data":"abe618aaf94bc7921b48220d2bdc98815bb3e9e874eed1eafa18ad212add54a3"} Dec 01 10:36:34 crc kubenswrapper[4909]: I1201 10:36:34.598760 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knj4t" event={"ID":"f10aa799-1736-49e3-a39d-d0a61dfbf0c4","Type":"ContainerStarted","Data":"e4981b0c628ca871c2336698465a81b1062640c2142f1f7862d8cac9d9a258d1"} Dec 01 10:36:34 crc kubenswrapper[4909]: 
I1201 10:36:34.622693 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd102738-71ee-4009-8101-f645325f2de7-utilities\") pod \"redhat-marketplace-cbcqx\" (UID: \"dd102738-71ee-4009-8101-f645325f2de7\") " pod="openshift-marketplace/redhat-marketplace-cbcqx" Dec 01 10:36:34 crc kubenswrapper[4909]: I1201 10:36:34.622771 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd102738-71ee-4009-8101-f645325f2de7-catalog-content\") pod \"redhat-marketplace-cbcqx\" (UID: \"dd102738-71ee-4009-8101-f645325f2de7\") " pod="openshift-marketplace/redhat-marketplace-cbcqx" Dec 01 10:36:34 crc kubenswrapper[4909]: I1201 10:36:34.622814 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrzfw\" (UniqueName: \"kubernetes.io/projected/dd102738-71ee-4009-8101-f645325f2de7-kube-api-access-mrzfw\") pod \"redhat-marketplace-cbcqx\" (UID: \"dd102738-71ee-4009-8101-f645325f2de7\") " pod="openshift-marketplace/redhat-marketplace-cbcqx" Dec 01 10:36:34 crc kubenswrapper[4909]: I1201 10:36:34.623254 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd102738-71ee-4009-8101-f645325f2de7-utilities\") pod \"redhat-marketplace-cbcqx\" (UID: \"dd102738-71ee-4009-8101-f645325f2de7\") " pod="openshift-marketplace/redhat-marketplace-cbcqx" Dec 01 10:36:34 crc kubenswrapper[4909]: I1201 10:36:34.623545 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd102738-71ee-4009-8101-f645325f2de7-catalog-content\") pod \"redhat-marketplace-cbcqx\" (UID: \"dd102738-71ee-4009-8101-f645325f2de7\") " pod="openshift-marketplace/redhat-marketplace-cbcqx" Dec 01 10:36:34 crc kubenswrapper[4909]: I1201 10:36:34.641955 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrzfw\" (UniqueName: \"kubernetes.io/projected/dd102738-71ee-4009-8101-f645325f2de7-kube-api-access-mrzfw\") pod \"redhat-marketplace-cbcqx\" (UID: \"dd102738-71ee-4009-8101-f645325f2de7\") " pod="openshift-marketplace/redhat-marketplace-cbcqx" Dec 01 10:36:34 crc kubenswrapper[4909]: I1201 10:36:34.736288 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbcqx" Dec 01 10:36:35 crc kubenswrapper[4909]: I1201 10:36:35.140228 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbcqx"] Dec 01 10:36:35 crc kubenswrapper[4909]: W1201 10:36:35.162365 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd102738_71ee_4009_8101_f645325f2de7.slice/crio-ba855b4a501643bdfaae07004567bab16d5305f27067fbda1e2a872129bc7324 WatchSource:0}: Error finding container ba855b4a501643bdfaae07004567bab16d5305f27067fbda1e2a872129bc7324: Status 404 returned error can't find the container with id ba855b4a501643bdfaae07004567bab16d5305f27067fbda1e2a872129bc7324 Dec 01 10:36:35 crc kubenswrapper[4909]: I1201 10:36:35.166990 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:36:35 crc kubenswrapper[4909]: I1201 10:36:35.174704 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:36:35 crc kubenswrapper[4909]: I1201 10:36:35.606981 4909 generic.go:334] "Generic (PLEG): container finished" podID="dd102738-71ee-4009-8101-f645325f2de7" containerID="e93057771a26a6e806eb27b715fb4cae384c309ce11427e69f04734999f2fda8" exitCode=0 Dec 01 10:36:35 crc kubenswrapper[4909]: I1201 10:36:35.607054 4909 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbcqx" event={"ID":"dd102738-71ee-4009-8101-f645325f2de7","Type":"ContainerDied","Data":"e93057771a26a6e806eb27b715fb4cae384c309ce11427e69f04734999f2fda8"} Dec 01 10:36:35 crc kubenswrapper[4909]: I1201 10:36:35.607086 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbcqx" event={"ID":"dd102738-71ee-4009-8101-f645325f2de7","Type":"ContainerStarted","Data":"ba855b4a501643bdfaae07004567bab16d5305f27067fbda1e2a872129bc7324"} Dec 01 10:36:35 crc kubenswrapper[4909]: I1201 10:36:35.609211 4909 generic.go:334] "Generic (PLEG): container finished" podID="2d64a6cb-82ae-49da-9518-72b6727de254" containerID="324a1e5be1f4f09c3cf7d9dd1c32e1bb28cbc2d38ba30c2f6fedec68af9d4a69" exitCode=0 Dec 01 10:36:35 crc kubenswrapper[4909]: I1201 10:36:35.609244 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ffblz" event={"ID":"2d64a6cb-82ae-49da-9518-72b6727de254","Type":"ContainerDied","Data":"324a1e5be1f4f09c3cf7d9dd1c32e1bb28cbc2d38ba30c2f6fedec68af9d4a69"} Dec 01 10:36:35 crc kubenswrapper[4909]: I1201 10:36:35.613648 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 10:36:36 crc kubenswrapper[4909]: I1201 10:36:36.615847 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ffblz" event={"ID":"2d64a6cb-82ae-49da-9518-72b6727de254","Type":"ContainerStarted","Data":"2945b01a7ee60c688a0ddcfacba38c9e3309aad055443fc599d28efcbf476a23"} Dec 01 10:36:36 crc kubenswrapper[4909]: I1201 10:36:36.625541 4909 generic.go:334] "Generic (PLEG): container finished" podID="f10aa799-1736-49e3-a39d-d0a61dfbf0c4" containerID="ddbca77349cbf68e35772b7bb6811796bc595e3566f458d67bd175e9ce0ae3f8" exitCode=0 Dec 01 10:36:36 crc kubenswrapper[4909]: I1201 10:36:36.625615 4909 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knj4t" event={"ID":"f10aa799-1736-49e3-a39d-d0a61dfbf0c4","Type":"ContainerDied","Data":"ddbca77349cbf68e35772b7bb6811796bc595e3566f458d67bd175e9ce0ae3f8"} Dec 01 10:36:36 crc kubenswrapper[4909]: I1201 10:36:36.629541 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z5f7f" event={"ID":"2345b837-4d7d-4d0a-b834-cb26782887ed","Type":"ContainerStarted","Data":"8aa72338339d9218c59b3b7781c85a8743a67ff7d514dc34528d9f8d90f36d33"} Dec 01 10:36:36 crc kubenswrapper[4909]: I1201 10:36:36.633607 4909 generic.go:334] "Generic (PLEG): container finished" podID="dd102738-71ee-4009-8101-f645325f2de7" containerID="3ef3994b6d793a9c12e504499ec9e08151f369033263fd6abb46a6d7e098505c" exitCode=0 Dec 01 10:36:36 crc kubenswrapper[4909]: I1201 10:36:36.633756 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbcqx" event={"ID":"dd102738-71ee-4009-8101-f645325f2de7","Type":"ContainerDied","Data":"3ef3994b6d793a9c12e504499ec9e08151f369033263fd6abb46a6d7e098505c"} Dec 01 10:36:36 crc kubenswrapper[4909]: I1201 10:36:36.636794 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ffblz" podStartSLOduration=1.9888855479999998 podStartE2EDuration="5.636784004s" podCreationTimestamp="2025-12-01 10:36:31 +0000 UTC" firstStartedPulling="2025-12-01 10:36:32.58317951 +0000 UTC m=+309.817650428" lastFinishedPulling="2025-12-01 10:36:36.231077986 +0000 UTC m=+313.465548884" observedRunningTime="2025-12-01 10:36:36.633385782 +0000 UTC m=+313.867856720" watchObservedRunningTime="2025-12-01 10:36:36.636784004 +0000 UTC m=+313.871254902" Dec 01 10:36:36 crc kubenswrapper[4909]: I1201 10:36:36.677530 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z5f7f" podStartSLOduration=2.6384979079999997 
podStartE2EDuration="5.677502817s" podCreationTimestamp="2025-12-01 10:36:31 +0000 UTC" firstStartedPulling="2025-12-01 10:36:32.58318481 +0000 UTC m=+309.817655718" lastFinishedPulling="2025-12-01 10:36:35.622189729 +0000 UTC m=+312.856660627" observedRunningTime="2025-12-01 10:36:36.673830826 +0000 UTC m=+313.908301724" watchObservedRunningTime="2025-12-01 10:36:36.677502817 +0000 UTC m=+313.911973715" Dec 01 10:36:37 crc kubenswrapper[4909]: I1201 10:36:37.642611 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knj4t" event={"ID":"f10aa799-1736-49e3-a39d-d0a61dfbf0c4","Type":"ContainerStarted","Data":"93bf90c8fc71f5eb567eb02a78022245e10242c2e696e6dd5c530386326e523e"} Dec 01 10:36:37 crc kubenswrapper[4909]: I1201 10:36:37.645208 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbcqx" event={"ID":"dd102738-71ee-4009-8101-f645325f2de7","Type":"ContainerStarted","Data":"0b8c0e5797a183e8235ebccc34617602b8715937ed43b8c3cf4a2b3a2cab9ba7"} Dec 01 10:36:37 crc kubenswrapper[4909]: I1201 10:36:37.672259 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-knj4t" podStartSLOduration=2.067072892 podStartE2EDuration="4.672233021s" podCreationTimestamp="2025-12-01 10:36:33 +0000 UTC" firstStartedPulling="2025-12-01 10:36:34.600554834 +0000 UTC m=+311.835025732" lastFinishedPulling="2025-12-01 10:36:37.205714953 +0000 UTC m=+314.440185861" observedRunningTime="2025-12-01 10:36:37.667677662 +0000 UTC m=+314.902148570" watchObservedRunningTime="2025-12-01 10:36:37.672233021 +0000 UTC m=+314.906703949" Dec 01 10:36:41 crc kubenswrapper[4909]: I1201 10:36:41.484560 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z5f7f" Dec 01 10:36:41 crc kubenswrapper[4909]: I1201 10:36:41.485182 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-z5f7f" Dec 01 10:36:41 crc kubenswrapper[4909]: I1201 10:36:41.523191 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z5f7f" Dec 01 10:36:41 crc kubenswrapper[4909]: I1201 10:36:41.540843 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cbcqx" podStartSLOduration=5.931944575 podStartE2EDuration="7.540825809s" podCreationTimestamp="2025-12-01 10:36:34 +0000 UTC" firstStartedPulling="2025-12-01 10:36:35.62132103 +0000 UTC m=+312.855791938" lastFinishedPulling="2025-12-01 10:36:37.230202274 +0000 UTC m=+314.464673172" observedRunningTime="2025-12-01 10:36:37.696049191 +0000 UTC m=+314.930520089" watchObservedRunningTime="2025-12-01 10:36:41.540825809 +0000 UTC m=+318.775296707" Dec 01 10:36:41 crc kubenswrapper[4909]: I1201 10:36:41.708492 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z5f7f" Dec 01 10:36:41 crc kubenswrapper[4909]: I1201 10:36:41.910728 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ffblz" Dec 01 10:36:41 crc kubenswrapper[4909]: I1201 10:36:41.910782 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ffblz" Dec 01 10:36:41 crc kubenswrapper[4909]: I1201 10:36:41.948169 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ffblz" Dec 01 10:36:42 crc kubenswrapper[4909]: I1201 10:36:42.709176 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ffblz" Dec 01 10:36:43 crc kubenswrapper[4909]: I1201 10:36:43.479107 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-knj4t" Dec 01 10:36:43 crc kubenswrapper[4909]: I1201 10:36:43.479524 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-knj4t" Dec 01 10:36:43 crc kubenswrapper[4909]: I1201 10:36:43.513722 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-knj4t" Dec 01 10:36:43 crc kubenswrapper[4909]: I1201 10:36:43.712032 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-knj4t" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.080849 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hqgll"] Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.081793 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hqgll" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.084547 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.084863 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.093174 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hqgll"] Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.094714 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.134421 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cv9gs"] Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.135292 
4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.142039 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtkzb"] Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.142290 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb" podUID="f8782986-5304-487f-962e-5b2e9233ab75" containerName="controller-manager" containerID="cri-o://35115323614a45c76dd80f9a42b94f05f05b1a84cbb75c1b34f345e5c4976b10" gracePeriod=30 Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.164079 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cv9gs"] Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.203513 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf"] Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.203809 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf" podUID="2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8" containerName="route-controller-manager" containerID="cri-o://c9b659489dc8280cb0a167868b4067c8ba37afe2d1776f8e45d21527675af945" gracePeriod=30 Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.277779 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1a97019-8887-4437-af46-0c729260a089-bound-sa-token\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 
10:36:44.277851 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f1a97019-8887-4437-af46-0c729260a089-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.277924 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1a97019-8887-4437-af46-0c729260a089-trusted-ca\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.277961 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mjxc\" (UniqueName: \"kubernetes.io/projected/6e745987-2227-4479-9ea8-3bf3ce5ba444-kube-api-access-9mjxc\") pod \"marketplace-operator-79b997595-hqgll\" (UID: \"6e745987-2227-4479-9ea8-3bf3ce5ba444\") " pod="openshift-marketplace/marketplace-operator-79b997595-hqgll" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.277983 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f1a97019-8887-4437-af46-0c729260a089-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.278005 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6e745987-2227-4479-9ea8-3bf3ce5ba444-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hqgll\" (UID: \"6e745987-2227-4479-9ea8-3bf3ce5ba444\") " pod="openshift-marketplace/marketplace-operator-79b997595-hqgll" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.278088 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f1a97019-8887-4437-af46-0c729260a089-registry-tls\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.278126 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpfsn\" (UniqueName: \"kubernetes.io/projected/f1a97019-8887-4437-af46-0c729260a089-kube-api-access-rpfsn\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.278167 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.278202 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f1a97019-8887-4437-af46-0c729260a089-registry-certificates\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.278277 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6e745987-2227-4479-9ea8-3bf3ce5ba444-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hqgll\" (UID: \"6e745987-2227-4479-9ea8-3bf3ce5ba444\") " pod="openshift-marketplace/marketplace-operator-79b997595-hqgll" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.315799 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.379539 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1a97019-8887-4437-af46-0c729260a089-bound-sa-token\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.379614 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f1a97019-8887-4437-af46-0c729260a089-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.379705 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/f1a97019-8887-4437-af46-0c729260a089-trusted-ca\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.379745 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mjxc\" (UniqueName: \"kubernetes.io/projected/6e745987-2227-4479-9ea8-3bf3ce5ba444-kube-api-access-9mjxc\") pod \"marketplace-operator-79b997595-hqgll\" (UID: \"6e745987-2227-4479-9ea8-3bf3ce5ba444\") " pod="openshift-marketplace/marketplace-operator-79b997595-hqgll" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.379770 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f1a97019-8887-4437-af46-0c729260a089-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.379793 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e745987-2227-4479-9ea8-3bf3ce5ba444-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hqgll\" (UID: \"6e745987-2227-4479-9ea8-3bf3ce5ba444\") " pod="openshift-marketplace/marketplace-operator-79b997595-hqgll" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.379821 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f1a97019-8887-4437-af46-0c729260a089-registry-tls\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.379843 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpfsn\" (UniqueName: \"kubernetes.io/projected/f1a97019-8887-4437-af46-0c729260a089-kube-api-access-rpfsn\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.379905 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f1a97019-8887-4437-af46-0c729260a089-registry-certificates\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.379937 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6e745987-2227-4479-9ea8-3bf3ce5ba444-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hqgll\" (UID: \"6e745987-2227-4479-9ea8-3bf3ce5ba444\") " pod="openshift-marketplace/marketplace-operator-79b997595-hqgll" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.381063 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f1a97019-8887-4437-af46-0c729260a089-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.381684 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f1a97019-8887-4437-af46-0c729260a089-registry-certificates\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.381818 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1a97019-8887-4437-af46-0c729260a089-trusted-ca\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.382097 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e745987-2227-4479-9ea8-3bf3ce5ba444-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hqgll\" (UID: \"6e745987-2227-4479-9ea8-3bf3ce5ba444\") " pod="openshift-marketplace/marketplace-operator-79b997595-hqgll" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.386047 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f1a97019-8887-4437-af46-0c729260a089-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.386099 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6e745987-2227-4479-9ea8-3bf3ce5ba444-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hqgll\" (UID: \"6e745987-2227-4479-9ea8-3bf3ce5ba444\") " pod="openshift-marketplace/marketplace-operator-79b997595-hqgll" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.403357 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f1a97019-8887-4437-af46-0c729260a089-registry-tls\") pod 
\"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.405491 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpfsn\" (UniqueName: \"kubernetes.io/projected/f1a97019-8887-4437-af46-0c729260a089-kube-api-access-rpfsn\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.408625 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1a97019-8887-4437-af46-0c729260a089-bound-sa-token\") pod \"image-registry-66df7c8f76-cv9gs\" (UID: \"f1a97019-8887-4437-af46-0c729260a089\") " pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.409649 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mjxc\" (UniqueName: \"kubernetes.io/projected/6e745987-2227-4479-9ea8-3bf3ce5ba444-kube-api-access-9mjxc\") pod \"marketplace-operator-79b997595-hqgll\" (UID: \"6e745987-2227-4479-9ea8-3bf3ce5ba444\") " pod="openshift-marketplace/marketplace-operator-79b997595-hqgll" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.452692 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.649207 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cv9gs"] Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.685493 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" event={"ID":"f1a97019-8887-4437-af46-0c729260a089","Type":"ContainerStarted","Data":"2939f254c2dad0718b298fe5e93271c2d940f6ad11f5eb89c006d6e3fe4180cb"} Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.697481 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hqgll" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.739002 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cbcqx" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.739351 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cbcqx" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.784848 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cbcqx" Dec 01 10:36:44 crc kubenswrapper[4909]: I1201 10:36:44.885988 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hqgll"] Dec 01 10:36:44 crc kubenswrapper[4909]: W1201 10:36:44.887632 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e745987_2227_4479_9ea8_3bf3ce5ba444.slice/crio-b5d7af1c5b210a4707b40bbea49f1990de818e1c8c98b26aa0e9561934830459 WatchSource:0}: Error finding container b5d7af1c5b210a4707b40bbea49f1990de818e1c8c98b26aa0e9561934830459: Status 404 returned error 
can't find the container with id b5d7af1c5b210a4707b40bbea49f1990de818e1c8c98b26aa0e9561934830459 Dec 01 10:36:45 crc kubenswrapper[4909]: I1201 10:36:45.694153 4909 generic.go:334] "Generic (PLEG): container finished" podID="f8782986-5304-487f-962e-5b2e9233ab75" containerID="35115323614a45c76dd80f9a42b94f05f05b1a84cbb75c1b34f345e5c4976b10" exitCode=0 Dec 01 10:36:45 crc kubenswrapper[4909]: I1201 10:36:45.694263 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb" event={"ID":"f8782986-5304-487f-962e-5b2e9233ab75","Type":"ContainerDied","Data":"35115323614a45c76dd80f9a42b94f05f05b1a84cbb75c1b34f345e5c4976b10"} Dec 01 10:36:45 crc kubenswrapper[4909]: I1201 10:36:45.697169 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" event={"ID":"f1a97019-8887-4437-af46-0c729260a089","Type":"ContainerStarted","Data":"d864e3eb3dbbe7f568430cc1b83da387c51efc5f71a1a01c907ee1f2b9e3a3ec"} Dec 01 10:36:45 crc kubenswrapper[4909]: I1201 10:36:45.697571 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:36:45 crc kubenswrapper[4909]: I1201 10:36:45.699393 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hqgll" event={"ID":"6e745987-2227-4479-9ea8-3bf3ce5ba444","Type":"ContainerStarted","Data":"34d74c405cdeba04d6b56c7ff9a28b7936792025dd0c15e4dd1fb7765a72b12f"} Dec 01 10:36:45 crc kubenswrapper[4909]: I1201 10:36:45.699441 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hqgll" event={"ID":"6e745987-2227-4479-9ea8-3bf3ce5ba444","Type":"ContainerStarted","Data":"b5d7af1c5b210a4707b40bbea49f1990de818e1c8c98b26aa0e9561934830459"} Dec 01 10:36:45 crc kubenswrapper[4909]: I1201 10:36:45.718614 4909 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" podStartSLOduration=1.718592745 podStartE2EDuration="1.718592745s" podCreationTimestamp="2025-12-01 10:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:36:45.717006083 +0000 UTC m=+322.951476981" watchObservedRunningTime="2025-12-01 10:36:45.718592745 +0000 UTC m=+322.953063663" Dec 01 10:36:45 crc kubenswrapper[4909]: I1201 10:36:45.731460 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-hqgll" podStartSLOduration=1.731430425 podStartE2EDuration="1.731430425s" podCreationTimestamp="2025-12-01 10:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:36:45.731334662 +0000 UTC m=+322.965805570" watchObservedRunningTime="2025-12-01 10:36:45.731430425 +0000 UTC m=+322.965901313" Dec 01 10:36:45 crc kubenswrapper[4909]: I1201 10:36:45.753175 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cbcqx" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.504472 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.535320 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b"] Dec 01 10:36:46 crc kubenswrapper[4909]: E1201 10:36:46.535652 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8782986-5304-487f-962e-5b2e9233ab75" containerName="controller-manager" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.535676 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8782986-5304-487f-962e-5b2e9233ab75" containerName="controller-manager" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.535790 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8782986-5304-487f-962e-5b2e9233ab75" containerName="controller-manager" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.536352 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.545034 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b"] Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.610993 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfp5d\" (UniqueName: \"kubernetes.io/projected/f8782986-5304-487f-962e-5b2e9233ab75-kube-api-access-bfp5d\") pod \"f8782986-5304-487f-962e-5b2e9233ab75\" (UID: \"f8782986-5304-487f-962e-5b2e9233ab75\") " Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.611120 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8782986-5304-487f-962e-5b2e9233ab75-config\") pod \"f8782986-5304-487f-962e-5b2e9233ab75\" (UID: \"f8782986-5304-487f-962e-5b2e9233ab75\") " Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.611198 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8782986-5304-487f-962e-5b2e9233ab75-client-ca\") pod \"f8782986-5304-487f-962e-5b2e9233ab75\" (UID: \"f8782986-5304-487f-962e-5b2e9233ab75\") " Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.611266 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8782986-5304-487f-962e-5b2e9233ab75-serving-cert\") pod \"f8782986-5304-487f-962e-5b2e9233ab75\" (UID: \"f8782986-5304-487f-962e-5b2e9233ab75\") " Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.611293 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8782986-5304-487f-962e-5b2e9233ab75-proxy-ca-bundles\") pod \"f8782986-5304-487f-962e-5b2e9233ab75\" (UID: 
\"f8782986-5304-487f-962e-5b2e9233ab75\") " Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.611519 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snlpf\" (UniqueName: \"kubernetes.io/projected/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-kube-api-access-snlpf\") pod \"controller-manager-75c57f8b8b-nnt7b\" (UID: \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\") " pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.611723 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-config\") pod \"controller-manager-75c57f8b8b-nnt7b\" (UID: \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\") " pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.611912 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-proxy-ca-bundles\") pod \"controller-manager-75c57f8b8b-nnt7b\" (UID: \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\") " pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.612088 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-client-ca\") pod \"controller-manager-75c57f8b8b-nnt7b\" (UID: \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\") " pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.612123 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-serving-cert\") pod \"controller-manager-75c57f8b8b-nnt7b\" (UID: \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\") " pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.612472 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8782986-5304-487f-962e-5b2e9233ab75-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f8782986-5304-487f-962e-5b2e9233ab75" (UID: "f8782986-5304-487f-962e-5b2e9233ab75"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.612797 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8782986-5304-487f-962e-5b2e9233ab75-client-ca" (OuterVolumeSpecName: "client-ca") pod "f8782986-5304-487f-962e-5b2e9233ab75" (UID: "f8782986-5304-487f-962e-5b2e9233ab75"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.613015 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8782986-5304-487f-962e-5b2e9233ab75-config" (OuterVolumeSpecName: "config") pod "f8782986-5304-487f-962e-5b2e9233ab75" (UID: "f8782986-5304-487f-962e-5b2e9233ab75"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.618976 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8782986-5304-487f-962e-5b2e9233ab75-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f8782986-5304-487f-962e-5b2e9233ab75" (UID: "f8782986-5304-487f-962e-5b2e9233ab75"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.619677 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8782986-5304-487f-962e-5b2e9233ab75-kube-api-access-bfp5d" (OuterVolumeSpecName: "kube-api-access-bfp5d") pod "f8782986-5304-487f-962e-5b2e9233ab75" (UID: "f8782986-5304-487f-962e-5b2e9233ab75"). InnerVolumeSpecName "kube-api-access-bfp5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.710103 4909 generic.go:334] "Generic (PLEG): container finished" podID="2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8" containerID="c9b659489dc8280cb0a167868b4067c8ba37afe2d1776f8e45d21527675af945" exitCode=0 Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.710199 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf" event={"ID":"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8","Type":"ContainerDied","Data":"c9b659489dc8280cb0a167868b4067c8ba37afe2d1776f8e45d21527675af945"} Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.711832 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb" event={"ID":"f8782986-5304-487f-962e-5b2e9233ab75","Type":"ContainerDied","Data":"432d34971d4094e88a2190899f31113034076f54ae18369037abc57d19e79dee"} Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.711935 4909 scope.go:117] "RemoveContainer" containerID="35115323614a45c76dd80f9a42b94f05f05b1a84cbb75c1b34f345e5c4976b10" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.711838 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dtkzb" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.712024 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hqgll" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.713457 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-config\") pod \"controller-manager-75c57f8b8b-nnt7b\" (UID: \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\") " pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.713516 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-proxy-ca-bundles\") pod \"controller-manager-75c57f8b8b-nnt7b\" (UID: \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\") " pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.713551 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-client-ca\") pod \"controller-manager-75c57f8b8b-nnt7b\" (UID: \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\") " pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.713599 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-serving-cert\") pod \"controller-manager-75c57f8b8b-nnt7b\" (UID: \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\") " pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 
10:36:46.713689 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snlpf\" (UniqueName: \"kubernetes.io/projected/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-kube-api-access-snlpf\") pod \"controller-manager-75c57f8b8b-nnt7b\" (UID: \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\") " pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.714450 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8782986-5304-487f-962e-5b2e9233ab75-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.714861 4909 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8782986-5304-487f-962e-5b2e9233ab75-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.715659 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-client-ca\") pod \"controller-manager-75c57f8b8b-nnt7b\" (UID: \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\") " pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.715692 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-proxy-ca-bundles\") pod \"controller-manager-75c57f8b8b-nnt7b\" (UID: \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\") " pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.716018 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8782986-5304-487f-962e-5b2e9233ab75-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:46 
crc kubenswrapper[4909]: I1201 10:36:46.716058 4909 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8782986-5304-487f-962e-5b2e9233ab75-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.716077 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfp5d\" (UniqueName: \"kubernetes.io/projected/f8782986-5304-487f-962e-5b2e9233ab75-kube-api-access-bfp5d\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.716595 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-config\") pod \"controller-manager-75c57f8b8b-nnt7b\" (UID: \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\") " pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.716741 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hqgll" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.726355 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-serving-cert\") pod \"controller-manager-75c57f8b8b-nnt7b\" (UID: \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\") " pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.737139 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snlpf\" (UniqueName: \"kubernetes.io/projected/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-kube-api-access-snlpf\") pod \"controller-manager-75c57f8b8b-nnt7b\" (UID: \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\") " pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" Dec 01 10:36:46 crc 
kubenswrapper[4909]: I1201 10:36:46.789690 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtkzb"] Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.794023 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtkzb"] Dec 01 10:36:46 crc kubenswrapper[4909]: I1201 10:36:46.856494 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.269112 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8782986-5304-487f-962e-5b2e9233ab75" path="/var/lib/kubelet/pods/f8782986-5304-487f-962e-5b2e9233ab75/volumes" Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.269848 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b"] Dec 01 10:36:47 crc kubenswrapper[4909]: W1201 10:36:47.278258 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec3a1d8a_3454_40c1_a396_aae5aadbb9bb.slice/crio-4b0ea549c1ecc688a47a4d767e766b0e573d0fbb313a5bdedcaee20956e13721 WatchSource:0}: Error finding container 4b0ea549c1ecc688a47a4d767e766b0e573d0fbb313a5bdedcaee20956e13721: Status 404 returned error can't find the container with id 4b0ea549c1ecc688a47a4d767e766b0e573d0fbb313a5bdedcaee20956e13721 Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.279162 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf" Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.326242 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-serving-cert\") pod \"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8\" (UID: \"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8\") " Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.326322 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-config\") pod \"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8\" (UID: \"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8\") " Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.326371 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4crbr\" (UniqueName: \"kubernetes.io/projected/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-kube-api-access-4crbr\") pod \"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8\" (UID: \"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8\") " Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.326465 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-client-ca\") pod \"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8\" (UID: \"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8\") " Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.327343 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-client-ca" (OuterVolumeSpecName: "client-ca") pod "2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8" (UID: "2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.327450 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-config" (OuterVolumeSpecName: "config") pod "2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8" (UID: "2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.332268 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-kube-api-access-4crbr" (OuterVolumeSpecName: "kube-api-access-4crbr") pod "2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8" (UID: "2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8"). InnerVolumeSpecName "kube-api-access-4crbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.335549 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8" (UID: "2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.428970 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.429018 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.429029 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4crbr\" (UniqueName: \"kubernetes.io/projected/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-kube-api-access-4crbr\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.429038 4909 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.720245 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" event={"ID":"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb","Type":"ContainerStarted","Data":"bd97c3932090eaa3196d2e5f0e91bb4fe166d464eba6cec7d04546070b7210d9"} Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.720743 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.720843 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" event={"ID":"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb","Type":"ContainerStarted","Data":"4b0ea549c1ecc688a47a4d767e766b0e573d0fbb313a5bdedcaee20956e13721"} Dec 01 
10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.721603 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf" event={"ID":"2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8","Type":"ContainerDied","Data":"cc18a219bba8e8289b2601fff0491ee707e25a0f2316dcb2581ef2f14fe296d9"} Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.721682 4909 scope.go:117] "RemoveContainer" containerID="c9b659489dc8280cb0a167868b4067c8ba37afe2d1776f8e45d21527675af945" Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.721618 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf" Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.729823 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.742714 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" podStartSLOduration=3.742681988 podStartE2EDuration="3.742681988s" podCreationTimestamp="2025-12-01 10:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:36:47.741612863 +0000 UTC m=+324.976083771" watchObservedRunningTime="2025-12-01 10:36:47.742681988 +0000 UTC m=+324.977152916" Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.758188 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf"] Dec 01 10:36:47 crc kubenswrapper[4909]: I1201 10:36:47.787765 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j6zbf"] Dec 01 10:36:48 crc kubenswrapper[4909]: 
I1201 10:36:48.907712 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t"] Dec 01 10:36:48 crc kubenswrapper[4909]: E1201 10:36:48.908237 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8" containerName="route-controller-manager" Dec 01 10:36:48 crc kubenswrapper[4909]: I1201 10:36:48.908251 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8" containerName="route-controller-manager" Dec 01 10:36:48 crc kubenswrapper[4909]: I1201 10:36:48.908375 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8" containerName="route-controller-manager" Dec 01 10:36:48 crc kubenswrapper[4909]: I1201 10:36:48.908823 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t" Dec 01 10:36:48 crc kubenswrapper[4909]: I1201 10:36:48.910944 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 10:36:48 crc kubenswrapper[4909]: I1201 10:36:48.910969 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 10:36:48 crc kubenswrapper[4909]: I1201 10:36:48.911039 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 10:36:48 crc kubenswrapper[4909]: I1201 10:36:48.911438 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 10:36:48 crc kubenswrapper[4909]: I1201 10:36:48.911583 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 10:36:48 crc kubenswrapper[4909]: 
I1201 10:36:48.911594 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 10:36:48 crc kubenswrapper[4909]: I1201 10:36:48.920131 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t"] Dec 01 10:36:48 crc kubenswrapper[4909]: I1201 10:36:48.950687 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stmzk\" (UniqueName: \"kubernetes.io/projected/e4005684-ff4d-4512-af85-83bfdf665451-kube-api-access-stmzk\") pod \"route-controller-manager-857c5b895-mzg7t\" (UID: \"e4005684-ff4d-4512-af85-83bfdf665451\") " pod="openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t" Dec 01 10:36:48 crc kubenswrapper[4909]: I1201 10:36:48.950745 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4005684-ff4d-4512-af85-83bfdf665451-serving-cert\") pod \"route-controller-manager-857c5b895-mzg7t\" (UID: \"e4005684-ff4d-4512-af85-83bfdf665451\") " pod="openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t" Dec 01 10:36:48 crc kubenswrapper[4909]: I1201 10:36:48.950801 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4005684-ff4d-4512-af85-83bfdf665451-config\") pod \"route-controller-manager-857c5b895-mzg7t\" (UID: \"e4005684-ff4d-4512-af85-83bfdf665451\") " pod="openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t" Dec 01 10:36:48 crc kubenswrapper[4909]: I1201 10:36:48.950822 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4005684-ff4d-4512-af85-83bfdf665451-client-ca\") pod 
\"route-controller-manager-857c5b895-mzg7t\" (UID: \"e4005684-ff4d-4512-af85-83bfdf665451\") " pod="openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t" Dec 01 10:36:49 crc kubenswrapper[4909]: I1201 10:36:49.052423 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4005684-ff4d-4512-af85-83bfdf665451-config\") pod \"route-controller-manager-857c5b895-mzg7t\" (UID: \"e4005684-ff4d-4512-af85-83bfdf665451\") " pod="openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t" Dec 01 10:36:49 crc kubenswrapper[4909]: I1201 10:36:49.052474 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4005684-ff4d-4512-af85-83bfdf665451-client-ca\") pod \"route-controller-manager-857c5b895-mzg7t\" (UID: \"e4005684-ff4d-4512-af85-83bfdf665451\") " pod="openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t" Dec 01 10:36:49 crc kubenswrapper[4909]: I1201 10:36:49.052527 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stmzk\" (UniqueName: \"kubernetes.io/projected/e4005684-ff4d-4512-af85-83bfdf665451-kube-api-access-stmzk\") pod \"route-controller-manager-857c5b895-mzg7t\" (UID: \"e4005684-ff4d-4512-af85-83bfdf665451\") " pod="openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t" Dec 01 10:36:49 crc kubenswrapper[4909]: I1201 10:36:49.052563 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4005684-ff4d-4512-af85-83bfdf665451-serving-cert\") pod \"route-controller-manager-857c5b895-mzg7t\" (UID: \"e4005684-ff4d-4512-af85-83bfdf665451\") " pod="openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t" Dec 01 10:36:49 crc kubenswrapper[4909]: I1201 10:36:49.053567 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4005684-ff4d-4512-af85-83bfdf665451-client-ca\") pod \"route-controller-manager-857c5b895-mzg7t\" (UID: \"e4005684-ff4d-4512-af85-83bfdf665451\") " pod="openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t" Dec 01 10:36:49 crc kubenswrapper[4909]: I1201 10:36:49.053649 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4005684-ff4d-4512-af85-83bfdf665451-config\") pod \"route-controller-manager-857c5b895-mzg7t\" (UID: \"e4005684-ff4d-4512-af85-83bfdf665451\") " pod="openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t" Dec 01 10:36:49 crc kubenswrapper[4909]: I1201 10:36:49.058292 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4005684-ff4d-4512-af85-83bfdf665451-serving-cert\") pod \"route-controller-manager-857c5b895-mzg7t\" (UID: \"e4005684-ff4d-4512-af85-83bfdf665451\") " pod="openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t" Dec 01 10:36:49 crc kubenswrapper[4909]: I1201 10:36:49.070347 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stmzk\" (UniqueName: \"kubernetes.io/projected/e4005684-ff4d-4512-af85-83bfdf665451-kube-api-access-stmzk\") pod \"route-controller-manager-857c5b895-mzg7t\" (UID: \"e4005684-ff4d-4512-af85-83bfdf665451\") " pod="openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t" Dec 01 10:36:49 crc kubenswrapper[4909]: I1201 10:36:49.224431 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t" Dec 01 10:36:49 crc kubenswrapper[4909]: I1201 10:36:49.265454 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8" path="/var/lib/kubelet/pods/2c2c3a59-5dd8-4288-b7ac-47ff78abb6b8/volumes" Dec 01 10:36:49 crc kubenswrapper[4909]: I1201 10:36:49.613196 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t"] Dec 01 10:36:49 crc kubenswrapper[4909]: I1201 10:36:49.734088 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t" event={"ID":"e4005684-ff4d-4512-af85-83bfdf665451","Type":"ContainerStarted","Data":"e78a96ba7fe9386600ad5408f961f98247ed304fd89ad23fb39cbaec3c7d4af4"} Dec 01 10:36:50 crc kubenswrapper[4909]: I1201 10:36:50.741150 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t" event={"ID":"e4005684-ff4d-4512-af85-83bfdf665451","Type":"ContainerStarted","Data":"47cc311122edfc63029e470cf7b87f2564e6c6b6655ffb300e0671152d3831a2"} Dec 01 10:36:50 crc kubenswrapper[4909]: I1201 10:36:50.741528 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t" Dec 01 10:36:50 crc kubenswrapper[4909]: I1201 10:36:50.746864 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t" Dec 01 10:36:50 crc kubenswrapper[4909]: I1201 10:36:50.761668 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-857c5b895-mzg7t" podStartSLOduration=6.761647491 podStartE2EDuration="6.761647491s" 
podCreationTimestamp="2025-12-01 10:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:36:50.757540966 +0000 UTC m=+327.992011864" watchObservedRunningTime="2025-12-01 10:36:50.761647491 +0000 UTC m=+327.996118409" Dec 01 10:37:04 crc kubenswrapper[4909]: I1201 10:37:04.458676 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-cv9gs" Dec 01 10:37:04 crc kubenswrapper[4909]: I1201 10:37:04.522543 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7mhb9"] Dec 01 10:37:06 crc kubenswrapper[4909]: I1201 10:37:06.193685 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:37:06 crc kubenswrapper[4909]: I1201 10:37:06.193765 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.079903 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b"] Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.081203 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" podUID="ec3a1d8a-3454-40c1-a396-aae5aadbb9bb" containerName="controller-manager" 
containerID="cri-o://bd97c3932090eaa3196d2e5f0e91bb4fe166d464eba6cec7d04546070b7210d9" gracePeriod=30 Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.613370 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.630431 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-client-ca\") pod \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\" (UID: \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\") " Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.630599 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-proxy-ca-bundles\") pod \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\" (UID: \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\") " Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.630727 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-config\") pod \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\" (UID: \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\") " Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.630749 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-serving-cert\") pod \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\" (UID: \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\") " Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.630789 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snlpf\" (UniqueName: \"kubernetes.io/projected/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-kube-api-access-snlpf\") pod 
\"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\" (UID: \"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb\") " Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.631906 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-client-ca" (OuterVolumeSpecName: "client-ca") pod "ec3a1d8a-3454-40c1-a396-aae5aadbb9bb" (UID: "ec3a1d8a-3454-40c1-a396-aae5aadbb9bb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.632736 4909 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.632910 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-config" (OuterVolumeSpecName: "config") pod "ec3a1d8a-3454-40c1-a396-aae5aadbb9bb" (UID: "ec3a1d8a-3454-40c1-a396-aae5aadbb9bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.633825 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ec3a1d8a-3454-40c1-a396-aae5aadbb9bb" (UID: "ec3a1d8a-3454-40c1-a396-aae5aadbb9bb"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.642144 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ec3a1d8a-3454-40c1-a396-aae5aadbb9bb" (UID: "ec3a1d8a-3454-40c1-a396-aae5aadbb9bb"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.643128 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-kube-api-access-snlpf" (OuterVolumeSpecName: "kube-api-access-snlpf") pod "ec3a1d8a-3454-40c1-a396-aae5aadbb9bb" (UID: "ec3a1d8a-3454-40c1-a396-aae5aadbb9bb"). InnerVolumeSpecName "kube-api-access-snlpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.733500 4909 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.733549 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.733560 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.733572 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snlpf\" (UniqueName: \"kubernetes.io/projected/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb-kube-api-access-snlpf\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.926992 4909 generic.go:334] "Generic (PLEG): container finished" podID="ec3a1d8a-3454-40c1-a396-aae5aadbb9bb" containerID="bd97c3932090eaa3196d2e5f0e91bb4fe166d464eba6cec7d04546070b7210d9" exitCode=0 Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.927059 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.927054 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" event={"ID":"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb","Type":"ContainerDied","Data":"bd97c3932090eaa3196d2e5f0e91bb4fe166d464eba6cec7d04546070b7210d9"} Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.927289 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b" event={"ID":"ec3a1d8a-3454-40c1-a396-aae5aadbb9bb","Type":"ContainerDied","Data":"4b0ea549c1ecc688a47a4d767e766b0e573d0fbb313a5bdedcaee20956e13721"} Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.927365 4909 scope.go:117] "RemoveContainer" containerID="bd97c3932090eaa3196d2e5f0e91bb4fe166d464eba6cec7d04546070b7210d9" Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.959258 4909 scope.go:117] "RemoveContainer" containerID="bd97c3932090eaa3196d2e5f0e91bb4fe166d464eba6cec7d04546070b7210d9" Dec 01 10:37:21 crc kubenswrapper[4909]: E1201 10:37:21.961288 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd97c3932090eaa3196d2e5f0e91bb4fe166d464eba6cec7d04546070b7210d9\": container with ID starting with bd97c3932090eaa3196d2e5f0e91bb4fe166d464eba6cec7d04546070b7210d9 not found: ID does not exist" containerID="bd97c3932090eaa3196d2e5f0e91bb4fe166d464eba6cec7d04546070b7210d9" Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.961356 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd97c3932090eaa3196d2e5f0e91bb4fe166d464eba6cec7d04546070b7210d9"} err="failed to get container status \"bd97c3932090eaa3196d2e5f0e91bb4fe166d464eba6cec7d04546070b7210d9\": rpc error: code = NotFound desc = could not find container 
\"bd97c3932090eaa3196d2e5f0e91bb4fe166d464eba6cec7d04546070b7210d9\": container with ID starting with bd97c3932090eaa3196d2e5f0e91bb4fe166d464eba6cec7d04546070b7210d9 not found: ID does not exist" Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.961410 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b"] Dec 01 10:37:21 crc kubenswrapper[4909]: I1201 10:37:21.964930 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-75c57f8b8b-nnt7b"] Dec 01 10:37:22 crc kubenswrapper[4909]: I1201 10:37:22.935187 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-864bd886d9-c6zgc"] Dec 01 10:37:22 crc kubenswrapper[4909]: E1201 10:37:22.935417 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec3a1d8a-3454-40c1-a396-aae5aadbb9bb" containerName="controller-manager" Dec 01 10:37:22 crc kubenswrapper[4909]: I1201 10:37:22.935430 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec3a1d8a-3454-40c1-a396-aae5aadbb9bb" containerName="controller-manager" Dec 01 10:37:22 crc kubenswrapper[4909]: I1201 10:37:22.935540 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec3a1d8a-3454-40c1-a396-aae5aadbb9bb" containerName="controller-manager" Dec 01 10:37:22 crc kubenswrapper[4909]: I1201 10:37:22.936004 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" Dec 01 10:37:22 crc kubenswrapper[4909]: I1201 10:37:22.938485 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 10:37:22 crc kubenswrapper[4909]: I1201 10:37:22.938486 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 10:37:22 crc kubenswrapper[4909]: I1201 10:37:22.938804 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 10:37:22 crc kubenswrapper[4909]: I1201 10:37:22.938940 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 10:37:22 crc kubenswrapper[4909]: I1201 10:37:22.939157 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 10:37:22 crc kubenswrapper[4909]: I1201 10:37:22.940523 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 10:37:22 crc kubenswrapper[4909]: I1201 10:37:22.949578 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55035038-65ed-42ff-a643-2d1c92a0945b-config\") pod \"controller-manager-864bd886d9-c6zgc\" (UID: \"55035038-65ed-42ff-a643-2d1c92a0945b\") " pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" Dec 01 10:37:22 crc kubenswrapper[4909]: I1201 10:37:22.949620 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55035038-65ed-42ff-a643-2d1c92a0945b-serving-cert\") pod \"controller-manager-864bd886d9-c6zgc\" (UID: \"55035038-65ed-42ff-a643-2d1c92a0945b\") " 
pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" Dec 01 10:37:22 crc kubenswrapper[4909]: I1201 10:37:22.949648 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/55035038-65ed-42ff-a643-2d1c92a0945b-proxy-ca-bundles\") pod \"controller-manager-864bd886d9-c6zgc\" (UID: \"55035038-65ed-42ff-a643-2d1c92a0945b\") " pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" Dec 01 10:37:22 crc kubenswrapper[4909]: I1201 10:37:22.949698 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fgt9\" (UniqueName: \"kubernetes.io/projected/55035038-65ed-42ff-a643-2d1c92a0945b-kube-api-access-4fgt9\") pod \"controller-manager-864bd886d9-c6zgc\" (UID: \"55035038-65ed-42ff-a643-2d1c92a0945b\") " pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" Dec 01 10:37:22 crc kubenswrapper[4909]: I1201 10:37:22.949866 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/55035038-65ed-42ff-a643-2d1c92a0945b-client-ca\") pod \"controller-manager-864bd886d9-c6zgc\" (UID: \"55035038-65ed-42ff-a643-2d1c92a0945b\") " pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" Dec 01 10:37:22 crc kubenswrapper[4909]: I1201 10:37:22.951485 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 10:37:22 crc kubenswrapper[4909]: I1201 10:37:22.954136 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-864bd886d9-c6zgc"] Dec 01 10:37:23 crc kubenswrapper[4909]: I1201 10:37:23.050857 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/55035038-65ed-42ff-a643-2d1c92a0945b-config\") pod \"controller-manager-864bd886d9-c6zgc\" (UID: \"55035038-65ed-42ff-a643-2d1c92a0945b\") " pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" Dec 01 10:37:23 crc kubenswrapper[4909]: I1201 10:37:23.051152 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55035038-65ed-42ff-a643-2d1c92a0945b-serving-cert\") pod \"controller-manager-864bd886d9-c6zgc\" (UID: \"55035038-65ed-42ff-a643-2d1c92a0945b\") " pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" Dec 01 10:37:23 crc kubenswrapper[4909]: I1201 10:37:23.051263 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/55035038-65ed-42ff-a643-2d1c92a0945b-proxy-ca-bundles\") pod \"controller-manager-864bd886d9-c6zgc\" (UID: \"55035038-65ed-42ff-a643-2d1c92a0945b\") " pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" Dec 01 10:37:23 crc kubenswrapper[4909]: I1201 10:37:23.051413 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fgt9\" (UniqueName: \"kubernetes.io/projected/55035038-65ed-42ff-a643-2d1c92a0945b-kube-api-access-4fgt9\") pod \"controller-manager-864bd886d9-c6zgc\" (UID: \"55035038-65ed-42ff-a643-2d1c92a0945b\") " pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" Dec 01 10:37:23 crc kubenswrapper[4909]: I1201 10:37:23.051521 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/55035038-65ed-42ff-a643-2d1c92a0945b-client-ca\") pod \"controller-manager-864bd886d9-c6zgc\" (UID: \"55035038-65ed-42ff-a643-2d1c92a0945b\") " pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" Dec 01 10:37:23 crc kubenswrapper[4909]: I1201 10:37:23.052332 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/55035038-65ed-42ff-a643-2d1c92a0945b-proxy-ca-bundles\") pod \"controller-manager-864bd886d9-c6zgc\" (UID: \"55035038-65ed-42ff-a643-2d1c92a0945b\") " pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" Dec 01 10:37:23 crc kubenswrapper[4909]: I1201 10:37:23.052485 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/55035038-65ed-42ff-a643-2d1c92a0945b-client-ca\") pod \"controller-manager-864bd886d9-c6zgc\" (UID: \"55035038-65ed-42ff-a643-2d1c92a0945b\") " pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" Dec 01 10:37:23 crc kubenswrapper[4909]: I1201 10:37:23.052680 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55035038-65ed-42ff-a643-2d1c92a0945b-config\") pod \"controller-manager-864bd886d9-c6zgc\" (UID: \"55035038-65ed-42ff-a643-2d1c92a0945b\") " pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" Dec 01 10:37:23 crc kubenswrapper[4909]: I1201 10:37:23.056286 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55035038-65ed-42ff-a643-2d1c92a0945b-serving-cert\") pod \"controller-manager-864bd886d9-c6zgc\" (UID: \"55035038-65ed-42ff-a643-2d1c92a0945b\") " pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" Dec 01 10:37:23 crc kubenswrapper[4909]: I1201 10:37:23.072360 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fgt9\" (UniqueName: \"kubernetes.io/projected/55035038-65ed-42ff-a643-2d1c92a0945b-kube-api-access-4fgt9\") pod \"controller-manager-864bd886d9-c6zgc\" (UID: \"55035038-65ed-42ff-a643-2d1c92a0945b\") " pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" Dec 01 
10:37:23 crc kubenswrapper[4909]: I1201 10:37:23.268149 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 10:37:23 crc kubenswrapper[4909]: I1201 10:37:23.272974 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec3a1d8a-3454-40c1-a396-aae5aadbb9bb" path="/var/lib/kubelet/pods/ec3a1d8a-3454-40c1-a396-aae5aadbb9bb/volumes" Dec 01 10:37:23 crc kubenswrapper[4909]: I1201 10:37:23.274048 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" Dec 01 10:37:23 crc kubenswrapper[4909]: I1201 10:37:23.698977 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-864bd886d9-c6zgc"] Dec 01 10:37:23 crc kubenswrapper[4909]: W1201 10:37:23.705220 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55035038_65ed_42ff_a643_2d1c92a0945b.slice/crio-c142da48a0f6e21f049a4dc19a770b2cc3e075add5fbf02e543e804e1246b4f1 WatchSource:0}: Error finding container c142da48a0f6e21f049a4dc19a770b2cc3e075add5fbf02e543e804e1246b4f1: Status 404 returned error can't find the container with id c142da48a0f6e21f049a4dc19a770b2cc3e075add5fbf02e543e804e1246b4f1 Dec 01 10:37:23 crc kubenswrapper[4909]: I1201 10:37:23.940390 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" event={"ID":"55035038-65ed-42ff-a643-2d1c92a0945b","Type":"ContainerStarted","Data":"350488b1c1bfaab82b4ca68d3fde50cf2e6428670c922c8b8a2739d009610239"} Dec 01 10:37:23 crc kubenswrapper[4909]: I1201 10:37:23.940448 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" 
event={"ID":"55035038-65ed-42ff-a643-2d1c92a0945b","Type":"ContainerStarted","Data":"c142da48a0f6e21f049a4dc19a770b2cc3e075add5fbf02e543e804e1246b4f1"} Dec 01 10:37:23 crc kubenswrapper[4909]: I1201 10:37:23.940738 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" Dec 01 10:37:23 crc kubenswrapper[4909]: I1201 10:37:23.947915 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" Dec 01 10:37:23 crc kubenswrapper[4909]: I1201 10:37:23.960690 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-864bd886d9-c6zgc" podStartSLOduration=2.960670748 podStartE2EDuration="2.960670748s" podCreationTimestamp="2025-12-01 10:37:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:37:23.957055807 +0000 UTC m=+361.191526735" watchObservedRunningTime="2025-12-01 10:37:23.960670748 +0000 UTC m=+361.195141646" Dec 01 10:37:29 crc kubenswrapper[4909]: I1201 10:37:29.563076 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" podUID="4ba162cc-ca36-4d6d-9034-7b3ad6f59179" containerName="registry" containerID="cri-o://bf61c6461e409536da1c43bd644a1703408990dba3e62fbf7e649dc4517d9bd5" gracePeriod=30 Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.002205 4909 generic.go:334] "Generic (PLEG): container finished" podID="4ba162cc-ca36-4d6d-9034-7b3ad6f59179" containerID="bf61c6461e409536da1c43bd644a1703408990dba3e62fbf7e649dc4517d9bd5" exitCode=0 Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.002294 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" 
event={"ID":"4ba162cc-ca36-4d6d-9034-7b3ad6f59179","Type":"ContainerDied","Data":"bf61c6461e409536da1c43bd644a1703408990dba3e62fbf7e649dc4517d9bd5"} Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.002369 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" event={"ID":"4ba162cc-ca36-4d6d-9034-7b3ad6f59179","Type":"ContainerDied","Data":"d6367f632a1b9feab771c3b0c3a52431489b228e83dcb2576bac8690d7c575b6"} Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.002386 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6367f632a1b9feab771c3b0c3a52431489b228e83dcb2576bac8690d7c575b6" Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.031161 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.056516 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnmlf\" (UniqueName: \"kubernetes.io/projected/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-kube-api-access-mnmlf\") pod \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.056622 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-registry-certificates\") pod \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.056678 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-installation-pull-secrets\") pod \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\" (UID: 
\"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.057009 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.057138 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-ca-trust-extracted\") pod \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.057224 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-bound-sa-token\") pod \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.057320 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-registry-tls\") pod \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.057399 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-trusted-ca\") pod \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\" (UID: \"4ba162cc-ca36-4d6d-9034-7b3ad6f59179\") " Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.059685 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4ba162cc-ca36-4d6d-9034-7b3ad6f59179" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.061249 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4ba162cc-ca36-4d6d-9034-7b3ad6f59179" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.067037 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-kube-api-access-mnmlf" (OuterVolumeSpecName: "kube-api-access-mnmlf") pod "4ba162cc-ca36-4d6d-9034-7b3ad6f59179" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179"). InnerVolumeSpecName "kube-api-access-mnmlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.071395 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4ba162cc-ca36-4d6d-9034-7b3ad6f59179" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.071504 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4ba162cc-ca36-4d6d-9034-7b3ad6f59179" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.071836 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "4ba162cc-ca36-4d6d-9034-7b3ad6f59179" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.079037 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4ba162cc-ca36-4d6d-9034-7b3ad6f59179" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.088285 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4ba162cc-ca36-4d6d-9034-7b3ad6f59179" (UID: "4ba162cc-ca36-4d6d-9034-7b3ad6f59179"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.160836 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnmlf\" (UniqueName: \"kubernetes.io/projected/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-kube-api-access-mnmlf\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.160900 4909 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.160916 4909 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.160929 4909 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.160943 4909 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.160956 4909 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:30 crc kubenswrapper[4909]: I1201 10:37:30.160970 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ba162cc-ca36-4d6d-9034-7b3ad6f59179-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:31 crc 
kubenswrapper[4909]: I1201 10:37:31.007505 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7mhb9" Dec 01 10:37:31 crc kubenswrapper[4909]: I1201 10:37:31.045051 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7mhb9"] Dec 01 10:37:31 crc kubenswrapper[4909]: I1201 10:37:31.045594 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7mhb9"] Dec 01 10:37:31 crc kubenswrapper[4909]: I1201 10:37:31.269159 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ba162cc-ca36-4d6d-9034-7b3ad6f59179" path="/var/lib/kubelet/pods/4ba162cc-ca36-4d6d-9034-7b3ad6f59179/volumes" Dec 01 10:37:36 crc kubenswrapper[4909]: I1201 10:37:36.193687 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:37:36 crc kubenswrapper[4909]: I1201 10:37:36.194168 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:38:06 crc kubenswrapper[4909]: I1201 10:38:06.194143 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:38:06 crc kubenswrapper[4909]: I1201 10:38:06.195395 4909 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:38:06 crc kubenswrapper[4909]: I1201 10:38:06.196450 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:38:06 crc kubenswrapper[4909]: I1201 10:38:06.197799 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dbe2682cc5e6b212bd8e646f3120474003c4d5f01aacb8310079b39b95f1eaef"} pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:38:06 crc kubenswrapper[4909]: I1201 10:38:06.197973 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" containerID="cri-o://dbe2682cc5e6b212bd8e646f3120474003c4d5f01aacb8310079b39b95f1eaef" gracePeriod=600 Dec 01 10:38:07 crc kubenswrapper[4909]: I1201 10:38:07.238869 4909 generic.go:334] "Generic (PLEG): container finished" podID="672850e4-d044-44cc-b8a2-517dc1a285be" containerID="dbe2682cc5e6b212bd8e646f3120474003c4d5f01aacb8310079b39b95f1eaef" exitCode=0 Dec 01 10:38:07 crc kubenswrapper[4909]: I1201 10:38:07.238973 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerDied","Data":"dbe2682cc5e6b212bd8e646f3120474003c4d5f01aacb8310079b39b95f1eaef"} Dec 01 10:38:07 crc kubenswrapper[4909]: I1201 10:38:07.240398 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"cebd226375ed9ea525958531f3c656022fcef61d7c35f6db43a2b23cac24085f"} Dec 01 10:38:07 crc kubenswrapper[4909]: I1201 10:38:07.240450 4909 scope.go:117] "RemoveContainer" containerID="068b242f2e1a1ea5171531eee2b567e105515eb907da8f3626dfad1cd2e1954d" Dec 01 10:40:06 crc kubenswrapper[4909]: I1201 10:40:06.193482 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:40:06 crc kubenswrapper[4909]: I1201 10:40:06.194075 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:40:23 crc kubenswrapper[4909]: I1201 10:40:23.444281 4909 scope.go:117] "RemoveContainer" containerID="bf61c6461e409536da1c43bd644a1703408990dba3e62fbf7e649dc4517d9bd5" Dec 01 10:40:36 crc kubenswrapper[4909]: I1201 10:40:36.193361 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:40:36 crc kubenswrapper[4909]: I1201 10:40:36.193993 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:41:06 crc kubenswrapper[4909]: I1201 10:41:06.193931 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:41:06 crc kubenswrapper[4909]: I1201 10:41:06.194588 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:41:06 crc kubenswrapper[4909]: I1201 10:41:06.194645 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:41:06 crc kubenswrapper[4909]: I1201 10:41:06.195299 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cebd226375ed9ea525958531f3c656022fcef61d7c35f6db43a2b23cac24085f"} pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:41:06 crc kubenswrapper[4909]: I1201 10:41:06.195361 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" containerID="cri-o://cebd226375ed9ea525958531f3c656022fcef61d7c35f6db43a2b23cac24085f" gracePeriod=600 Dec 01 10:41:06 crc kubenswrapper[4909]: I1201 10:41:06.343593 4909 generic.go:334] "Generic 
(PLEG): container finished" podID="672850e4-d044-44cc-b8a2-517dc1a285be" containerID="cebd226375ed9ea525958531f3c656022fcef61d7c35f6db43a2b23cac24085f" exitCode=0 Dec 01 10:41:06 crc kubenswrapper[4909]: I1201 10:41:06.343697 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerDied","Data":"cebd226375ed9ea525958531f3c656022fcef61d7c35f6db43a2b23cac24085f"} Dec 01 10:41:06 crc kubenswrapper[4909]: I1201 10:41:06.343796 4909 scope.go:117] "RemoveContainer" containerID="dbe2682cc5e6b212bd8e646f3120474003c4d5f01aacb8310079b39b95f1eaef" Dec 01 10:41:07 crc kubenswrapper[4909]: I1201 10:41:07.350659 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"1261d57afc6b7af0172cc6d97bb6e0cf382f59bc9c526de8c48bb45bac9b39b3"} Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.694909 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-vkfwh"] Dec 01 10:41:32 crc kubenswrapper[4909]: E1201 10:41:32.695741 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ba162cc-ca36-4d6d-9034-7b3ad6f59179" containerName="registry" Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.695757 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba162cc-ca36-4d6d-9034-7b3ad6f59179" containerName="registry" Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.695856 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ba162cc-ca36-4d6d-9034-7b3ad6f59179" containerName="registry" Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.696316 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-vkfwh" Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.699476 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.699716 4909 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-h6n7m" Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.700653 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.715703 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-vkfwh"] Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.727999 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-9hgvq"] Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.729010 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-9hgvq" Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.736217 4909 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-ml6bh" Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.736423 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-pfndl"] Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.737374 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-pfndl" Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.739013 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh6rw\" (UniqueName: \"kubernetes.io/projected/223fa06f-e48d-419c-848f-02792e3f9a17-kube-api-access-vh6rw\") pod \"cert-manager-5b446d88c5-9hgvq\" (UID: \"223fa06f-e48d-419c-848f-02792e3f9a17\") " pod="cert-manager/cert-manager-5b446d88c5-9hgvq" Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.739074 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tnbz\" (UniqueName: \"kubernetes.io/projected/da64b9e1-fde2-45af-92aa-2a376d5afbcf-kube-api-access-7tnbz\") pod \"cert-manager-cainjector-7f985d654d-vkfwh\" (UID: \"da64b9e1-fde2-45af-92aa-2a376d5afbcf\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-vkfwh" Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.739095 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnjx9\" (UniqueName: \"kubernetes.io/projected/ff0653be-4d31-4748-9729-1114a630b8fd-kube-api-access-qnjx9\") pod \"cert-manager-webhook-5655c58dd6-pfndl\" (UID: \"ff0653be-4d31-4748-9729-1114a630b8fd\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-pfndl" Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.739816 4909 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-rkvhd" Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.757711 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-9hgvq"] Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.776336 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-pfndl"] Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.841838 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh6rw\" (UniqueName: \"kubernetes.io/projected/223fa06f-e48d-419c-848f-02792e3f9a17-kube-api-access-vh6rw\") pod \"cert-manager-5b446d88c5-9hgvq\" (UID: \"223fa06f-e48d-419c-848f-02792e3f9a17\") " pod="cert-manager/cert-manager-5b446d88c5-9hgvq" Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.841960 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tnbz\" (UniqueName: \"kubernetes.io/projected/da64b9e1-fde2-45af-92aa-2a376d5afbcf-kube-api-access-7tnbz\") pod \"cert-manager-cainjector-7f985d654d-vkfwh\" (UID: \"da64b9e1-fde2-45af-92aa-2a376d5afbcf\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-vkfwh" Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.841991 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnjx9\" (UniqueName: \"kubernetes.io/projected/ff0653be-4d31-4748-9729-1114a630b8fd-kube-api-access-qnjx9\") pod \"cert-manager-webhook-5655c58dd6-pfndl\" (UID: \"ff0653be-4d31-4748-9729-1114a630b8fd\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-pfndl" Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.862541 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnjx9\" (UniqueName: \"kubernetes.io/projected/ff0653be-4d31-4748-9729-1114a630b8fd-kube-api-access-qnjx9\") pod \"cert-manager-webhook-5655c58dd6-pfndl\" (UID: \"ff0653be-4d31-4748-9729-1114a630b8fd\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-pfndl" Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.862541 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tnbz\" (UniqueName: \"kubernetes.io/projected/da64b9e1-fde2-45af-92aa-2a376d5afbcf-kube-api-access-7tnbz\") pod \"cert-manager-cainjector-7f985d654d-vkfwh\" (UID: \"da64b9e1-fde2-45af-92aa-2a376d5afbcf\") " 
pod="cert-manager/cert-manager-cainjector-7f985d654d-vkfwh" Dec 01 10:41:32 crc kubenswrapper[4909]: I1201 10:41:32.862651 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh6rw\" (UniqueName: \"kubernetes.io/projected/223fa06f-e48d-419c-848f-02792e3f9a17-kube-api-access-vh6rw\") pod \"cert-manager-5b446d88c5-9hgvq\" (UID: \"223fa06f-e48d-419c-848f-02792e3f9a17\") " pod="cert-manager/cert-manager-5b446d88c5-9hgvq" Dec 01 10:41:33 crc kubenswrapper[4909]: I1201 10:41:33.012549 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-vkfwh" Dec 01 10:41:33 crc kubenswrapper[4909]: I1201 10:41:33.045072 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-9hgvq" Dec 01 10:41:33 crc kubenswrapper[4909]: I1201 10:41:33.056524 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-pfndl" Dec 01 10:41:33 crc kubenswrapper[4909]: I1201 10:41:33.376202 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-pfndl"] Dec 01 10:41:33 crc kubenswrapper[4909]: I1201 10:41:33.385277 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:41:33 crc kubenswrapper[4909]: I1201 10:41:33.444661 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-vkfwh"] Dec 01 10:41:33 crc kubenswrapper[4909]: W1201 10:41:33.456849 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda64b9e1_fde2_45af_92aa_2a376d5afbcf.slice/crio-28d8ba02decc72d3b7321b76ac045585cf5d248c2d1da4cc1b705f3fc0d822ac WatchSource:0}: Error finding container 28d8ba02decc72d3b7321b76ac045585cf5d248c2d1da4cc1b705f3fc0d822ac: Status 404 returned 
error can't find the container with id 28d8ba02decc72d3b7321b76ac045585cf5d248c2d1da4cc1b705f3fc0d822ac Dec 01 10:41:33 crc kubenswrapper[4909]: I1201 10:41:33.509768 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-pfndl" event={"ID":"ff0653be-4d31-4748-9729-1114a630b8fd","Type":"ContainerStarted","Data":"cfc739b926fea008797becd2a189ccdbe47835bbc9273ee11d027a3503535ff2"} Dec 01 10:41:33 crc kubenswrapper[4909]: I1201 10:41:33.511284 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-vkfwh" event={"ID":"da64b9e1-fde2-45af-92aa-2a376d5afbcf","Type":"ContainerStarted","Data":"28d8ba02decc72d3b7321b76ac045585cf5d248c2d1da4cc1b705f3fc0d822ac"} Dec 01 10:41:33 crc kubenswrapper[4909]: I1201 10:41:33.521351 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-9hgvq"] Dec 01 10:41:33 crc kubenswrapper[4909]: W1201 10:41:33.524606 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod223fa06f_e48d_419c_848f_02792e3f9a17.slice/crio-6555e6716b4ee3a20d532bcacce266a83b1c56122f9d2ee0ecd8d8ec669dbf4b WatchSource:0}: Error finding container 6555e6716b4ee3a20d532bcacce266a83b1c56122f9d2ee0ecd8d8ec669dbf4b: Status 404 returned error can't find the container with id 6555e6716b4ee3a20d532bcacce266a83b1c56122f9d2ee0ecd8d8ec669dbf4b Dec 01 10:41:34 crc kubenswrapper[4909]: I1201 10:41:34.519859 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-9hgvq" event={"ID":"223fa06f-e48d-419c-848f-02792e3f9a17","Type":"ContainerStarted","Data":"6555e6716b4ee3a20d532bcacce266a83b1c56122f9d2ee0ecd8d8ec669dbf4b"} Dec 01 10:41:37 crc kubenswrapper[4909]: I1201 10:41:37.542343 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-pfndl" 
event={"ID":"ff0653be-4d31-4748-9729-1114a630b8fd","Type":"ContainerStarted","Data":"54d4a151c1c82fce7aead98eeb776a7a1fa50f2e90d877418f3c060068bc459f"} Dec 01 10:41:37 crc kubenswrapper[4909]: I1201 10:41:37.542757 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-pfndl" Dec 01 10:41:37 crc kubenswrapper[4909]: I1201 10:41:37.545892 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-vkfwh" event={"ID":"da64b9e1-fde2-45af-92aa-2a376d5afbcf","Type":"ContainerStarted","Data":"8d7a38ef42127732117a24f91b1a8fa6e660ce75ebd9bcd30d93734d0fbd5c46"} Dec 01 10:41:37 crc kubenswrapper[4909]: I1201 10:41:37.562232 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-pfndl" podStartSLOduration=1.8230613359999999 podStartE2EDuration="5.562212412s" podCreationTimestamp="2025-12-01 10:41:32 +0000 UTC" firstStartedPulling="2025-12-01 10:41:33.385003356 +0000 UTC m=+610.619474254" lastFinishedPulling="2025-12-01 10:41:37.124154432 +0000 UTC m=+614.358625330" observedRunningTime="2025-12-01 10:41:37.558278558 +0000 UTC m=+614.792749456" watchObservedRunningTime="2025-12-01 10:41:37.562212412 +0000 UTC m=+614.796683310" Dec 01 10:41:37 crc kubenswrapper[4909]: I1201 10:41:37.588305 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-vkfwh" podStartSLOduration=1.918497943 podStartE2EDuration="5.588274686s" podCreationTimestamp="2025-12-01 10:41:32 +0000 UTC" firstStartedPulling="2025-12-01 10:41:33.46041539 +0000 UTC m=+610.694886288" lastFinishedPulling="2025-12-01 10:41:37.130192133 +0000 UTC m=+614.364663031" observedRunningTime="2025-12-01 10:41:37.584656882 +0000 UTC m=+614.819127800" watchObservedRunningTime="2025-12-01 10:41:37.588274686 +0000 UTC m=+614.822745594" Dec 01 10:41:38 crc kubenswrapper[4909]: I1201 
10:41:38.553279 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-9hgvq" event={"ID":"223fa06f-e48d-419c-848f-02792e3f9a17","Type":"ContainerStarted","Data":"e410374ab8a575b0ea1f9dd795298ab70417f8abc0947b64e1aa7e82ec443c47"} Dec 01 10:41:38 crc kubenswrapper[4909]: I1201 10:41:38.568915 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-9hgvq" podStartSLOduration=2.570225797 podStartE2EDuration="6.56889589s" podCreationTimestamp="2025-12-01 10:41:32 +0000 UTC" firstStartedPulling="2025-12-01 10:41:33.530049211 +0000 UTC m=+610.764520109" lastFinishedPulling="2025-12-01 10:41:37.528719304 +0000 UTC m=+614.763190202" observedRunningTime="2025-12-01 10:41:38.566510104 +0000 UTC m=+615.800981012" watchObservedRunningTime="2025-12-01 10:41:38.56889589 +0000 UTC m=+615.803366798" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.060678 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-pfndl" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.200750 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j5rks"] Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.201428 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovn-controller" containerID="cri-o://68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209" gracePeriod=30 Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.201527 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4" 
gracePeriod=30 Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.201537 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="northd" containerID="cri-o://c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f" gracePeriod=30 Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.201677 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="kube-rbac-proxy-node" containerID="cri-o://1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1" gracePeriod=30 Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.201568 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="sbdb" containerID="cri-o://b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314" gracePeriod=30 Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.201527 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="nbdb" containerID="cri-o://23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a" gracePeriod=30 Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.201737 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovn-acl-logging" containerID="cri-o://e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1" gracePeriod=30 Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.252426 4909 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovnkube-controller" containerID="cri-o://c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f" gracePeriod=30 Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.423963 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314 is running failed: container process not found" containerID="b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.424009 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a is running failed: container process not found" containerID="23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.425406 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314 is running failed: container process not found" containerID="b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.425417 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a is running failed: container process not found" containerID="23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.425801 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a is running failed: container process not found" containerID="23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.425828 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314 is running failed: container process not found" containerID="b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.425853 4909 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="nbdb" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.425863 4909 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="sbdb" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.528115 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5rks_57aeccf3-ec18-4a73-bd74-9b188de510ad/ovnkube-controller/3.log" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.531369 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5rks_57aeccf3-ec18-4a73-bd74-9b188de510ad/ovn-acl-logging/0.log" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.532302 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5rks_57aeccf3-ec18-4a73-bd74-9b188de510ad/ovn-controller/0.log" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.532779 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.598335 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2qpdc_89f06a94-5047-41d9-90a3-8433149d22c4/kube-multus/2.log" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.599807 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2qpdc_89f06a94-5047-41d9-90a3-8433149d22c4/kube-multus/1.log" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.600983 4909 generic.go:334] "Generic (PLEG): container finished" podID="89f06a94-5047-41d9-90a3-8433149d22c4" containerID="f87f912fdd49fda2a27ad7e25c8a792af8b5c9e78f06e76d346d060137e87026" exitCode=2 Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.601102 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2qpdc" event={"ID":"89f06a94-5047-41d9-90a3-8433149d22c4","Type":"ContainerDied","Data":"f87f912fdd49fda2a27ad7e25c8a792af8b5c9e78f06e76d346d060137e87026"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.601157 4909 scope.go:117] "RemoveContainer" containerID="73cbec22bbb541e1899f2414143c3c295a3824da919403f4bf9d7a3d2f7e49a5" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.602729 4909 scope.go:117] "RemoveContainer" containerID="f87f912fdd49fda2a27ad7e25c8a792af8b5c9e78f06e76d346d060137e87026" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.603142 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-2qpdc_openshift-multus(89f06a94-5047-41d9-90a3-8433149d22c4)\"" pod="openshift-multus/multus-2qpdc" podUID="89f06a94-5047-41d9-90a3-8433149d22c4" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.612571 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5rks_57aeccf3-ec18-4a73-bd74-9b188de510ad/ovnkube-controller/3.log" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.621802 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t7549"] Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.622105 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovn-controller" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622128 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovn-controller" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.622138 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="northd" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622145 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="northd" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.622159 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovnkube-controller" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622168 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovnkube-controller" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.622178 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="nbdb" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622185 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="nbdb" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.622197 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovnkube-controller" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622204 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovnkube-controller" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.622216 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="sbdb" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622224 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="sbdb" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.622234 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovnkube-controller" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622241 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovnkube-controller" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.622253 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovnkube-controller" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622263 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovnkube-controller" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.622272 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovn-acl-logging" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622279 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovn-acl-logging" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.622289 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovnkube-controller" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622296 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovnkube-controller" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.622311 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622318 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.622329 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="kubecfg-setup" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622337 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="kubecfg-setup" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.622348 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="kube-rbac-proxy-node" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622356 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="kube-rbac-proxy-node" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622561 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovnkube-controller" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622601 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovnkube-controller" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622613 4909 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622622 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovnkube-controller" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622630 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="sbdb" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622640 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="kube-rbac-proxy-node" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622652 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovn-controller" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622659 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="nbdb" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622672 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovn-acl-logging" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.622729 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="northd" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.623060 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovnkube-controller" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.623076 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerName="ovnkube-controller" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.624461 
4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5rks_57aeccf3-ec18-4a73-bd74-9b188de510ad/ovn-acl-logging/0.log" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.625234 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.625665 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5rks_57aeccf3-ec18-4a73-bd74-9b188de510ad/ovn-controller/0.log" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626291 4909 generic.go:334] "Generic (PLEG): container finished" podID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerID="c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f" exitCode=0 Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626345 4909 generic.go:334] "Generic (PLEG): container finished" podID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerID="b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314" exitCode=0 Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626365 4909 generic.go:334] "Generic (PLEG): container finished" podID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerID="23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a" exitCode=0 Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626380 4909 generic.go:334] "Generic (PLEG): container finished" podID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerID="c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f" exitCode=0 Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626395 4909 generic.go:334] "Generic (PLEG): container finished" podID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerID="60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4" exitCode=0 Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626408 4909 generic.go:334] "Generic (PLEG): container finished" 
podID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerID="1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1" exitCode=0 Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626421 4909 generic.go:334] "Generic (PLEG): container finished" podID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerID="e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1" exitCode=143 Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626434 4909 generic.go:334] "Generic (PLEG): container finished" podID="57aeccf3-ec18-4a73-bd74-9b188de510ad" containerID="68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209" exitCode=143 Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626475 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerDied","Data":"c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626531 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerDied","Data":"b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626557 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerDied","Data":"23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626575 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerDied","Data":"c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626629 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerDied","Data":"60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626653 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerDied","Data":"1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626677 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626700 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626713 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626726 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626740 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626750 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626764 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626787 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626798 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626809 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626824 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerDied","Data":"e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626842 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626854 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626862 4909 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626870 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626903 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626913 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626927 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626938 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626965 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626977 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626991 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerDied","Data":"68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.627006 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.627015 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.627023 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.627032 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.627042 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.627050 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.627059 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1"} Dec 01 
10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.627065 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.627073 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.627081 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.627092 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" event={"ID":"57aeccf3-ec18-4a73-bd74-9b188de510ad","Type":"ContainerDied","Data":"7eb314174062af013b837397c3d50d0a81173d8221ce9d2f03c041dc9b1c86c9"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.627103 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.627114 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.627124 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.627135 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.627144 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.627154 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.627164 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.627175 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.627185 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.627195 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4"} Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.626763 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j5rks" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.658730 4909 scope.go:117] "RemoveContainer" containerID="c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.685912 4909 scope.go:117] "RemoveContainer" containerID="c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.702545 4909 scope.go:117] "RemoveContainer" containerID="b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.703946 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-cni-netd\") pod \"57aeccf3-ec18-4a73-bd74-9b188de510ad\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704011 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57aeccf3-ec18-4a73-bd74-9b188de510ad-ovnkube-config\") pod \"57aeccf3-ec18-4a73-bd74-9b188de510ad\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704046 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "57aeccf3-ec18-4a73-bd74-9b188de510ad" (UID: "57aeccf3-ec18-4a73-bd74-9b188de510ad"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704060 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh2w8\" (UniqueName: \"kubernetes.io/projected/57aeccf3-ec18-4a73-bd74-9b188de510ad-kube-api-access-xh2w8\") pod \"57aeccf3-ec18-4a73-bd74-9b188de510ad\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704098 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-run-ovn\") pod \"57aeccf3-ec18-4a73-bd74-9b188de510ad\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704157 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57aeccf3-ec18-4a73-bd74-9b188de510ad-env-overrides\") pod \"57aeccf3-ec18-4a73-bd74-9b188de510ad\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704185 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-var-lib-openvswitch\") pod \"57aeccf3-ec18-4a73-bd74-9b188de510ad\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704206 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-log-socket\") pod \"57aeccf3-ec18-4a73-bd74-9b188de510ad\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704231 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-slash\") pod \"57aeccf3-ec18-4a73-bd74-9b188de510ad\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704267 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-run-ovn-kubernetes\") pod \"57aeccf3-ec18-4a73-bd74-9b188de510ad\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704289 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-etc-openvswitch\") pod \"57aeccf3-ec18-4a73-bd74-9b188de510ad\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704317 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-kubelet\") pod \"57aeccf3-ec18-4a73-bd74-9b188de510ad\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704353 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-var-lib-cni-networks-ovn-kubernetes\") pod \"57aeccf3-ec18-4a73-bd74-9b188de510ad\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704593 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57aeccf3-ec18-4a73-bd74-9b188de510ad-ovn-node-metrics-cert\") pod \"57aeccf3-ec18-4a73-bd74-9b188de510ad\" (UID: 
\"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704612 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-node-log\") pod \"57aeccf3-ec18-4a73-bd74-9b188de510ad\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704642 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-systemd-units\") pod \"57aeccf3-ec18-4a73-bd74-9b188de510ad\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704662 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-run-netns\") pod \"57aeccf3-ec18-4a73-bd74-9b188de510ad\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704681 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-cni-bin\") pod \"57aeccf3-ec18-4a73-bd74-9b188de510ad\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704699 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-run-systemd\") pod \"57aeccf3-ec18-4a73-bd74-9b188de510ad\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704725 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-run-openvswitch\") pod \"57aeccf3-ec18-4a73-bd74-9b188de510ad\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704727 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57aeccf3-ec18-4a73-bd74-9b188de510ad-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "57aeccf3-ec18-4a73-bd74-9b188de510ad" (UID: "57aeccf3-ec18-4a73-bd74-9b188de510ad"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704745 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/57aeccf3-ec18-4a73-bd74-9b188de510ad-ovnkube-script-lib\") pod \"57aeccf3-ec18-4a73-bd74-9b188de510ad\" (UID: \"57aeccf3-ec18-4a73-bd74-9b188de510ad\") " Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704766 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "57aeccf3-ec18-4a73-bd74-9b188de510ad" (UID: "57aeccf3-ec18-4a73-bd74-9b188de510ad"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704797 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "57aeccf3-ec18-4a73-bd74-9b188de510ad" (UID: "57aeccf3-ec18-4a73-bd74-9b188de510ad"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.704844 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "57aeccf3-ec18-4a73-bd74-9b188de510ad" (UID: "57aeccf3-ec18-4a73-bd74-9b188de510ad"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.705104 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57aeccf3-ec18-4a73-bd74-9b188de510ad-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "57aeccf3-ec18-4a73-bd74-9b188de510ad" (UID: "57aeccf3-ec18-4a73-bd74-9b188de510ad"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.705130 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "57aeccf3-ec18-4a73-bd74-9b188de510ad" (UID: "57aeccf3-ec18-4a73-bd74-9b188de510ad"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.705155 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "57aeccf3-ec18-4a73-bd74-9b188de510ad" (UID: "57aeccf3-ec18-4a73-bd74-9b188de510ad"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.705299 4909 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.705302 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57aeccf3-ec18-4a73-bd74-9b188de510ad-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "57aeccf3-ec18-4a73-bd74-9b188de510ad" (UID: "57aeccf3-ec18-4a73-bd74-9b188de510ad"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.705315 4909 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.705357 4909 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.705370 4909 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57aeccf3-ec18-4a73-bd74-9b188de510ad-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.705384 4909 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.705396 4909 reconciler_common.go:293] "Volume detached for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57aeccf3-ec18-4a73-bd74-9b188de510ad-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.705407 4909 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.705418 4909 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.705449 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "57aeccf3-ec18-4a73-bd74-9b188de510ad" (UID: "57aeccf3-ec18-4a73-bd74-9b188de510ad"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.705478 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "57aeccf3-ec18-4a73-bd74-9b188de510ad" (UID: "57aeccf3-ec18-4a73-bd74-9b188de510ad"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.705524 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-log-socket" (OuterVolumeSpecName: "log-socket") pod "57aeccf3-ec18-4a73-bd74-9b188de510ad" (UID: "57aeccf3-ec18-4a73-bd74-9b188de510ad"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.705549 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-slash" (OuterVolumeSpecName: "host-slash") pod "57aeccf3-ec18-4a73-bd74-9b188de510ad" (UID: "57aeccf3-ec18-4a73-bd74-9b188de510ad"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.705569 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "57aeccf3-ec18-4a73-bd74-9b188de510ad" (UID: "57aeccf3-ec18-4a73-bd74-9b188de510ad"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.705591 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "57aeccf3-ec18-4a73-bd74-9b188de510ad" (UID: "57aeccf3-ec18-4a73-bd74-9b188de510ad"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.705615 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "57aeccf3-ec18-4a73-bd74-9b188de510ad" (UID: "57aeccf3-ec18-4a73-bd74-9b188de510ad"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.706180 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-node-log" (OuterVolumeSpecName: "node-log") pod "57aeccf3-ec18-4a73-bd74-9b188de510ad" (UID: "57aeccf3-ec18-4a73-bd74-9b188de510ad"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.712373 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57aeccf3-ec18-4a73-bd74-9b188de510ad-kube-api-access-xh2w8" (OuterVolumeSpecName: "kube-api-access-xh2w8") pod "57aeccf3-ec18-4a73-bd74-9b188de510ad" (UID: "57aeccf3-ec18-4a73-bd74-9b188de510ad"). InnerVolumeSpecName "kube-api-access-xh2w8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.713039 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57aeccf3-ec18-4a73-bd74-9b188de510ad-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "57aeccf3-ec18-4a73-bd74-9b188de510ad" (UID: "57aeccf3-ec18-4a73-bd74-9b188de510ad"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.720093 4909 scope.go:117] "RemoveContainer" containerID="23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.723031 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "57aeccf3-ec18-4a73-bd74-9b188de510ad" (UID: "57aeccf3-ec18-4a73-bd74-9b188de510ad"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.734579 4909 scope.go:117] "RemoveContainer" containerID="c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.748757 4909 scope.go:117] "RemoveContainer" containerID="60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.764153 4909 scope.go:117] "RemoveContainer" containerID="1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.780503 4909 scope.go:117] "RemoveContainer" containerID="e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.794514 4909 scope.go:117] "RemoveContainer" containerID="68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.806791 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-var-lib-openvswitch\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.806840 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-run-openvswitch\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.806899 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/ad0ee0d7-6697-42d2-90ed-62426b38ab57-ovnkube-script-lib\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807021 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-systemd-units\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807065 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ad0ee0d7-6697-42d2-90ed-62426b38ab57-env-overrides\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807126 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807177 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-run-ovn\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807197 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-log-socket\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807221 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ad0ee0d7-6697-42d2-90ed-62426b38ab57-ovnkube-config\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807253 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-host-run-ovn-kubernetes\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807273 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-host-cni-netd\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807293 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-host-slash\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807332 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-host-kubelet\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807360 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-host-run-netns\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807385 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-run-systemd\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807406 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-host-cni-bin\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807432 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tv5r\" (UniqueName: \"kubernetes.io/projected/ad0ee0d7-6697-42d2-90ed-62426b38ab57-kube-api-access-4tv5r\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 
10:41:43.807450 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-etc-openvswitch\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807465 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-node-log\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807485 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad0ee0d7-6697-42d2-90ed-62426b38ab57-ovn-node-metrics-cert\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807530 4909 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/57aeccf3-ec18-4a73-bd74-9b188de510ad-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807542 4909 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807555 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh2w8\" (UniqueName: \"kubernetes.io/projected/57aeccf3-ec18-4a73-bd74-9b188de510ad-kube-api-access-xh2w8\") on node \"crc\" DevicePath \"\"" Dec 01 
10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807564 4909 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-log-socket\") on node \"crc\" DevicePath \"\"" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807576 4909 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-slash\") on node \"crc\" DevicePath \"\"" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807588 4909 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807602 4909 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807616 4909 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-node-log\") on node \"crc\" DevicePath \"\"" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807629 4909 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57aeccf3-ec18-4a73-bd74-9b188de510ad-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807640 4909 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807652 4909 reconciler_common.go:293] "Volume detached for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.807664 4909 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57aeccf3-ec18-4a73-bd74-9b188de510ad-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.809198 4909 scope.go:117] "RemoveContainer" containerID="9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.829529 4909 scope.go:117] "RemoveContainer" containerID="c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.830283 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f\": container with ID starting with c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f not found: ID does not exist" containerID="c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.830348 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f"} err="failed to get container status \"c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f\": rpc error: code = NotFound desc = could not find container \"c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f\": container with ID starting with c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.830383 4909 scope.go:117] "RemoveContainer" containerID="c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef" 
Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.830825 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef\": container with ID starting with c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef not found: ID does not exist" containerID="c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.830851 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef"} err="failed to get container status \"c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef\": rpc error: code = NotFound desc = could not find container \"c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef\": container with ID starting with c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.830896 4909 scope.go:117] "RemoveContainer" containerID="b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.831162 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\": container with ID starting with b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314 not found: ID does not exist" containerID="b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.831224 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314"} err="failed to get container status 
\"b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\": rpc error: code = NotFound desc = could not find container \"b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\": container with ID starting with b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.831244 4909 scope.go:117] "RemoveContainer" containerID="23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.831618 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\": container with ID starting with 23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a not found: ID does not exist" containerID="23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.831677 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a"} err="failed to get container status \"23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\": rpc error: code = NotFound desc = could not find container \"23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\": container with ID starting with 23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.831729 4909 scope.go:117] "RemoveContainer" containerID="c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.832135 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\": container with ID starting with c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f not found: ID does not exist" containerID="c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.832183 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f"} err="failed to get container status \"c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\": rpc error: code = NotFound desc = could not find container \"c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\": container with ID starting with c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.832200 4909 scope.go:117] "RemoveContainer" containerID="60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.832509 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\": container with ID starting with 60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4 not found: ID does not exist" containerID="60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.832547 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4"} err="failed to get container status \"60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\": rpc error: code = NotFound desc = could not find container \"60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\": container with ID 
starting with 60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.832577 4909 scope.go:117] "RemoveContainer" containerID="1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.832930 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\": container with ID starting with 1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1 not found: ID does not exist" containerID="1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.832977 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1"} err="failed to get container status \"1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\": rpc error: code = NotFound desc = could not find container \"1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\": container with ID starting with 1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.832993 4909 scope.go:117] "RemoveContainer" containerID="e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.833334 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\": container with ID starting with e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1 not found: ID does not exist" containerID="e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1" Dec 01 
10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.833355 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1"} err="failed to get container status \"e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\": rpc error: code = NotFound desc = could not find container \"e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\": container with ID starting with e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.833370 4909 scope.go:117] "RemoveContainer" containerID="68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.833704 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\": container with ID starting with 68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209 not found: ID does not exist" containerID="68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.833728 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209"} err="failed to get container status \"68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\": rpc error: code = NotFound desc = could not find container \"68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\": container with ID starting with 68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.833742 4909 scope.go:117] "RemoveContainer" 
containerID="9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4" Dec 01 10:41:43 crc kubenswrapper[4909]: E1201 10:41:43.834010 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\": container with ID starting with 9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4 not found: ID does not exist" containerID="9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.834045 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4"} err="failed to get container status \"9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\": rpc error: code = NotFound desc = could not find container \"9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\": container with ID starting with 9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.834070 4909 scope.go:117] "RemoveContainer" containerID="c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.834375 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f"} err="failed to get container status \"c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f\": rpc error: code = NotFound desc = could not find container \"c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f\": container with ID starting with c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.834417 4909 scope.go:117] 
"RemoveContainer" containerID="c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.834676 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef"} err="failed to get container status \"c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef\": rpc error: code = NotFound desc = could not find container \"c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef\": container with ID starting with c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.834699 4909 scope.go:117] "RemoveContainer" containerID="b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.835039 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314"} err="failed to get container status \"b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\": rpc error: code = NotFound desc = could not find container \"b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\": container with ID starting with b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.835061 4909 scope.go:117] "RemoveContainer" containerID="23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.835313 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a"} err="failed to get container status \"23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\": rpc error: code = 
NotFound desc = could not find container \"23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\": container with ID starting with 23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.835342 4909 scope.go:117] "RemoveContainer" containerID="c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.835630 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f"} err="failed to get container status \"c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\": rpc error: code = NotFound desc = could not find container \"c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\": container with ID starting with c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.835700 4909 scope.go:117] "RemoveContainer" containerID="60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.836067 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4"} err="failed to get container status \"60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\": rpc error: code = NotFound desc = could not find container \"60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\": container with ID starting with 60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.836101 4909 scope.go:117] "RemoveContainer" containerID="1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1" Dec 01 10:41:43 crc 
kubenswrapper[4909]: I1201 10:41:43.836409 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1"} err="failed to get container status \"1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\": rpc error: code = NotFound desc = could not find container \"1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\": container with ID starting with 1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.836435 4909 scope.go:117] "RemoveContainer" containerID="e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.836742 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1"} err="failed to get container status \"e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\": rpc error: code = NotFound desc = could not find container \"e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\": container with ID starting with e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.836767 4909 scope.go:117] "RemoveContainer" containerID="68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.837147 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209"} err="failed to get container status \"68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\": rpc error: code = NotFound desc = could not find container \"68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\": container 
with ID starting with 68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.837183 4909 scope.go:117] "RemoveContainer" containerID="9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.837483 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4"} err="failed to get container status \"9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\": rpc error: code = NotFound desc = could not find container \"9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\": container with ID starting with 9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.837506 4909 scope.go:117] "RemoveContainer" containerID="c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.837789 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f"} err="failed to get container status \"c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f\": rpc error: code = NotFound desc = could not find container \"c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f\": container with ID starting with c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.837819 4909 scope.go:117] "RemoveContainer" containerID="c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.838199 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef"} err="failed to get container status \"c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef\": rpc error: code = NotFound desc = could not find container \"c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef\": container with ID starting with c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.838235 4909 scope.go:117] "RemoveContainer" containerID="b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.838581 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314"} err="failed to get container status \"b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\": rpc error: code = NotFound desc = could not find container \"b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\": container with ID starting with b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.838616 4909 scope.go:117] "RemoveContainer" containerID="23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.839176 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a"} err="failed to get container status \"23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\": rpc error: code = NotFound desc = could not find container \"23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\": container with ID starting with 23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a not found: ID does not 
exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.839229 4909 scope.go:117] "RemoveContainer" containerID="c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.839571 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f"} err="failed to get container status \"c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\": rpc error: code = NotFound desc = could not find container \"c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\": container with ID starting with c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.839628 4909 scope.go:117] "RemoveContainer" containerID="60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.840842 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4"} err="failed to get container status \"60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\": rpc error: code = NotFound desc = could not find container \"60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\": container with ID starting with 60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.840902 4909 scope.go:117] "RemoveContainer" containerID="1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.841596 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1"} err="failed to get container status 
\"1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\": rpc error: code = NotFound desc = could not find container \"1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\": container with ID starting with 1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.841639 4909 scope.go:117] "RemoveContainer" containerID="e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.842193 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1"} err="failed to get container status \"e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\": rpc error: code = NotFound desc = could not find container \"e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\": container with ID starting with e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.842245 4909 scope.go:117] "RemoveContainer" containerID="68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.842738 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209"} err="failed to get container status \"68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\": rpc error: code = NotFound desc = could not find container \"68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\": container with ID starting with 68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.842785 4909 scope.go:117] "RemoveContainer" 
containerID="9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.843299 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4"} err="failed to get container status \"9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\": rpc error: code = NotFound desc = could not find container \"9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\": container with ID starting with 9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.843343 4909 scope.go:117] "RemoveContainer" containerID="c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.843955 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f"} err="failed to get container status \"c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f\": rpc error: code = NotFound desc = could not find container \"c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f\": container with ID starting with c59ac933d2dbe451ac67db12998f877b86c1410537163e1a03830377236eb63f not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.843998 4909 scope.go:117] "RemoveContainer" containerID="c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.844530 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef"} err="failed to get container status \"c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef\": rpc error: code = NotFound desc = could 
not find container \"c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef\": container with ID starting with c8a30e21545b9a4893b3acbbbdea3607fd28aa3d6f92a13516eca7d33c41f0ef not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.844579 4909 scope.go:117] "RemoveContainer" containerID="b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.845194 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314"} err="failed to get container status \"b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\": rpc error: code = NotFound desc = could not find container \"b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314\": container with ID starting with b6d3b2b5012e8659304de487742f61dc82fbe7fb8c777495031d0c292a764314 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.845262 4909 scope.go:117] "RemoveContainer" containerID="23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.845751 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a"} err="failed to get container status \"23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\": rpc error: code = NotFound desc = could not find container \"23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a\": container with ID starting with 23aed5f546da4bee28c0c1bb67d45195960ec19638d85ac2f71cb0985997e11a not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.845785 4909 scope.go:117] "RemoveContainer" containerID="c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 
10:41:43.846221 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f"} err="failed to get container status \"c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\": rpc error: code = NotFound desc = could not find container \"c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f\": container with ID starting with c16873c366581007d96632e56a7dbfaa90ca1b3c1da245d08ab1c6522309207f not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.846260 4909 scope.go:117] "RemoveContainer" containerID="60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.846512 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4"} err="failed to get container status \"60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\": rpc error: code = NotFound desc = could not find container \"60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4\": container with ID starting with 60f37ee02b463205182604df806774d11ea34d008e82e5e9325c0a9b4adabcd4 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.846536 4909 scope.go:117] "RemoveContainer" containerID="1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.846844 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1"} err="failed to get container status \"1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\": rpc error: code = NotFound desc = could not find container \"1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1\": container with ID starting with 
1fea8ffc4a2d45580b3857e7ae0b11222f5d0fcedd78807d5fa065d0e3dd58e1 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.846863 4909 scope.go:117] "RemoveContainer" containerID="e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.847121 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1"} err="failed to get container status \"e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\": rpc error: code = NotFound desc = could not find container \"e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1\": container with ID starting with e8f82c10bcf9b50884b60f955cbae0b17ec70345abb54a9d08b1c1495cf64fa1 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.847144 4909 scope.go:117] "RemoveContainer" containerID="68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.847391 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209"} err="failed to get container status \"68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\": rpc error: code = NotFound desc = could not find container \"68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209\": container with ID starting with 68ab2e57ec2e0241982b52ac349c433a1c5735a8a2a313bee782e2307521e209 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.847426 4909 scope.go:117] "RemoveContainer" containerID="9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.847663 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4"} err="failed to get container status \"9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\": rpc error: code = NotFound desc = could not find container \"9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4\": container with ID starting with 9ed1934210b25403b6426eddedb185f72d200cbdbe98c043f8f6d7909e37f9a4 not found: ID does not exist" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.909132 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-host-kubelet\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.909227 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-host-run-netns\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.909254 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-run-systemd\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.909279 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-host-cni-bin\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 
10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.909308 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tv5r\" (UniqueName: \"kubernetes.io/projected/ad0ee0d7-6697-42d2-90ed-62426b38ab57-kube-api-access-4tv5r\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.909330 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-etc-openvswitch\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.909349 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-node-log\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.909377 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad0ee0d7-6697-42d2-90ed-62426b38ab57-ovn-node-metrics-cert\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.909402 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-var-lib-openvswitch\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 
10:41:43.909433 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-run-openvswitch\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.909454 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ad0ee0d7-6697-42d2-90ed-62426b38ab57-ovnkube-script-lib\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.909484 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-systemd-units\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.909505 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ad0ee0d7-6697-42d2-90ed-62426b38ab57-env-overrides\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.909531 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.909562 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-run-ovn\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.909588 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-log-socket\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.909609 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ad0ee0d7-6697-42d2-90ed-62426b38ab57-ovnkube-config\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.909635 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-host-run-ovn-kubernetes\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.909657 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-host-cni-netd\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.909683 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-host-slash\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.910118 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-host-kubelet\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.910173 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-host-run-netns\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.910206 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-run-systemd\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.910235 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-host-cni-bin\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.910966 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-etc-openvswitch\") pod \"ovnkube-node-t7549\" 
(UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.911007 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-node-log\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.912033 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.912151 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-var-lib-openvswitch\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.912181 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-run-openvswitch\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.912980 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-systemd-units\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.913188 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ad0ee0d7-6697-42d2-90ed-62426b38ab57-ovnkube-script-lib\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.913610 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ad0ee0d7-6697-42d2-90ed-62426b38ab57-ovnkube-config\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.913657 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-run-ovn\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.913680 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-log-socket\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.913703 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-host-cni-netd\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.913724 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-host-run-ovn-kubernetes\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.913752 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ad0ee0d7-6697-42d2-90ed-62426b38ab57-host-slash\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.913856 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ad0ee0d7-6697-42d2-90ed-62426b38ab57-env-overrides\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.915412 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad0ee0d7-6697-42d2-90ed-62426b38ab57-ovn-node-metrics-cert\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.935394 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tv5r\" (UniqueName: \"kubernetes.io/projected/ad0ee0d7-6697-42d2-90ed-62426b38ab57-kube-api-access-4tv5r\") pod \"ovnkube-node-t7549\" (UID: \"ad0ee0d7-6697-42d2-90ed-62426b38ab57\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.955818 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.969623 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j5rks"] Dec 01 10:41:43 crc kubenswrapper[4909]: I1201 10:41:43.974018 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j5rks"] Dec 01 10:41:44 crc kubenswrapper[4909]: I1201 10:41:44.637369 4909 generic.go:334] "Generic (PLEG): container finished" podID="ad0ee0d7-6697-42d2-90ed-62426b38ab57" containerID="559b7ec7fc902b1e7a58d535bd688cf07a6b4995ac5d3a305c5f6382e9f5a40b" exitCode=0 Dec 01 10:41:44 crc kubenswrapper[4909]: I1201 10:41:44.637475 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7549" event={"ID":"ad0ee0d7-6697-42d2-90ed-62426b38ab57","Type":"ContainerDied","Data":"559b7ec7fc902b1e7a58d535bd688cf07a6b4995ac5d3a305c5f6382e9f5a40b"} Dec 01 10:41:44 crc kubenswrapper[4909]: I1201 10:41:44.638035 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7549" event={"ID":"ad0ee0d7-6697-42d2-90ed-62426b38ab57","Type":"ContainerStarted","Data":"0a8c0310e94ae3bdecc3cade70ec06086c9db5e0090f7d3f3c233aa84a0d7ca4"} Dec 01 10:41:44 crc kubenswrapper[4909]: I1201 10:41:44.640157 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2qpdc_89f06a94-5047-41d9-90a3-8433149d22c4/kube-multus/2.log" Dec 01 10:41:45 crc kubenswrapper[4909]: I1201 10:41:45.264228 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57aeccf3-ec18-4a73-bd74-9b188de510ad" path="/var/lib/kubelet/pods/57aeccf3-ec18-4a73-bd74-9b188de510ad/volumes" Dec 01 10:41:45 crc kubenswrapper[4909]: I1201 10:41:45.651097 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7549" 
event={"ID":"ad0ee0d7-6697-42d2-90ed-62426b38ab57","Type":"ContainerStarted","Data":"247e2c133a67b56e692efdf89e20d40d61130634de1a809ec2044eb37e643bc4"} Dec 01 10:41:45 crc kubenswrapper[4909]: I1201 10:41:45.651143 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7549" event={"ID":"ad0ee0d7-6697-42d2-90ed-62426b38ab57","Type":"ContainerStarted","Data":"8f87d4faee05ecbba520e3416b7735192bd4998eea05c76a050ebba5e9748930"} Dec 01 10:41:45 crc kubenswrapper[4909]: I1201 10:41:45.651153 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7549" event={"ID":"ad0ee0d7-6697-42d2-90ed-62426b38ab57","Type":"ContainerStarted","Data":"dd32052ee630567c738514ce1b072637ff6902f1f3e84e4321d269e576f9bd57"} Dec 01 10:41:45 crc kubenswrapper[4909]: I1201 10:41:45.651162 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7549" event={"ID":"ad0ee0d7-6697-42d2-90ed-62426b38ab57","Type":"ContainerStarted","Data":"e7b38b03f3bafc00ed20580a03a16b970f3e9089d2e392b1955a854854aa60fe"} Dec 01 10:41:45 crc kubenswrapper[4909]: I1201 10:41:45.651174 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7549" event={"ID":"ad0ee0d7-6697-42d2-90ed-62426b38ab57","Type":"ContainerStarted","Data":"009c640298923d342d45d891f58dfbe9c2a4527c58e92a55e67d67c618c29b28"} Dec 01 10:41:45 crc kubenswrapper[4909]: I1201 10:41:45.651182 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7549" event={"ID":"ad0ee0d7-6697-42d2-90ed-62426b38ab57","Type":"ContainerStarted","Data":"42cc5f75827d3fa5104f7e0c537bfa558f154646162ca3c85dfd18ceed687cf8"} Dec 01 10:41:47 crc kubenswrapper[4909]: I1201 10:41:47.668192 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7549" 
event={"ID":"ad0ee0d7-6697-42d2-90ed-62426b38ab57","Type":"ContainerStarted","Data":"54cd11005a82715e94ac9472f8efb561a819d30cfb38c14f095a09a45b8260ae"} Dec 01 10:41:50 crc kubenswrapper[4909]: I1201 10:41:50.690329 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7549" event={"ID":"ad0ee0d7-6697-42d2-90ed-62426b38ab57","Type":"ContainerStarted","Data":"a5385d5dd4ecb70fc69e25c3a8bb53c3e7fc48cd6283ce81489ec729bef80635"} Dec 01 10:41:50 crc kubenswrapper[4909]: I1201 10:41:50.690890 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:50 crc kubenswrapper[4909]: I1201 10:41:50.690908 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:50 crc kubenswrapper[4909]: I1201 10:41:50.720307 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-t7549" podStartSLOduration=7.72028953 podStartE2EDuration="7.72028953s" podCreationTimestamp="2025-12-01 10:41:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:41:50.718284886 +0000 UTC m=+627.952755804" watchObservedRunningTime="2025-12-01 10:41:50.72028953 +0000 UTC m=+627.954760448" Dec 01 10:41:50 crc kubenswrapper[4909]: I1201 10:41:50.762444 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:51 crc kubenswrapper[4909]: I1201 10:41:51.695421 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:51 crc kubenswrapper[4909]: I1201 10:41:51.718649 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:41:58 crc 
kubenswrapper[4909]: I1201 10:41:58.257836 4909 scope.go:117] "RemoveContainer" containerID="f87f912fdd49fda2a27ad7e25c8a792af8b5c9e78f06e76d346d060137e87026" Dec 01 10:41:58 crc kubenswrapper[4909]: E1201 10:41:58.258607 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-2qpdc_openshift-multus(89f06a94-5047-41d9-90a3-8433149d22c4)\"" pod="openshift-multus/multus-2qpdc" podUID="89f06a94-5047-41d9-90a3-8433149d22c4" Dec 01 10:42:12 crc kubenswrapper[4909]: I1201 10:42:12.257422 4909 scope.go:117] "RemoveContainer" containerID="f87f912fdd49fda2a27ad7e25c8a792af8b5c9e78f06e76d346d060137e87026" Dec 01 10:42:12 crc kubenswrapper[4909]: I1201 10:42:12.828782 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2qpdc_89f06a94-5047-41d9-90a3-8433149d22c4/kube-multus/2.log" Dec 01 10:42:12 crc kubenswrapper[4909]: I1201 10:42:12.829404 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2qpdc" event={"ID":"89f06a94-5047-41d9-90a3-8433149d22c4","Type":"ContainerStarted","Data":"816ebf476c7f94a2a65993961bbe708ac92688771fa974242a2efe2d53acb249"} Dec 01 10:42:13 crc kubenswrapper[4909]: I1201 10:42:13.978983 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t7549" Dec 01 10:42:23 crc kubenswrapper[4909]: I1201 10:42:23.442509 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz"] Dec 01 10:42:23 crc kubenswrapper[4909]: I1201 10:42:23.444839 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz"
Dec 01 10:42:23 crc kubenswrapper[4909]: I1201 10:42:23.450883 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 01 10:42:23 crc kubenswrapper[4909]: I1201 10:42:23.453612 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz"]
Dec 01 10:42:23 crc kubenswrapper[4909]: I1201 10:42:23.603947 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7k2h\" (UniqueName: \"kubernetes.io/projected/1168e695-833f-4350-9091-31cf52abecb7-kube-api-access-l7k2h\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz\" (UID: \"1168e695-833f-4350-9091-31cf52abecb7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz"
Dec 01 10:42:23 crc kubenswrapper[4909]: I1201 10:42:23.604507 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1168e695-833f-4350-9091-31cf52abecb7-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz\" (UID: \"1168e695-833f-4350-9091-31cf52abecb7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz"
Dec 01 10:42:23 crc kubenswrapper[4909]: I1201 10:42:23.604535 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1168e695-833f-4350-9091-31cf52abecb7-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz\" (UID: \"1168e695-833f-4350-9091-31cf52abecb7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz"
Dec 01 10:42:23 crc kubenswrapper[4909]: I1201 10:42:23.706417 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7k2h\" (UniqueName: \"kubernetes.io/projected/1168e695-833f-4350-9091-31cf52abecb7-kube-api-access-l7k2h\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz\" (UID: \"1168e695-833f-4350-9091-31cf52abecb7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz"
Dec 01 10:42:23 crc kubenswrapper[4909]: I1201 10:42:23.706507 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1168e695-833f-4350-9091-31cf52abecb7-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz\" (UID: \"1168e695-833f-4350-9091-31cf52abecb7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz"
Dec 01 10:42:23 crc kubenswrapper[4909]: I1201 10:42:23.706558 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1168e695-833f-4350-9091-31cf52abecb7-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz\" (UID: \"1168e695-833f-4350-9091-31cf52abecb7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz"
Dec 01 10:42:23 crc kubenswrapper[4909]: I1201 10:42:23.707676 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1168e695-833f-4350-9091-31cf52abecb7-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz\" (UID: \"1168e695-833f-4350-9091-31cf52abecb7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz"
Dec 01 10:42:23 crc kubenswrapper[4909]: I1201 10:42:23.707869 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1168e695-833f-4350-9091-31cf52abecb7-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz\" (UID: \"1168e695-833f-4350-9091-31cf52abecb7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz"
Dec 01 10:42:23 crc kubenswrapper[4909]: I1201 10:42:23.745348 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7k2h\" (UniqueName: \"kubernetes.io/projected/1168e695-833f-4350-9091-31cf52abecb7-kube-api-access-l7k2h\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz\" (UID: \"1168e695-833f-4350-9091-31cf52abecb7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz"
Dec 01 10:42:23 crc kubenswrapper[4909]: I1201 10:42:23.761127 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz"
Dec 01 10:42:23 crc kubenswrapper[4909]: I1201 10:42:23.979373 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz"]
Dec 01 10:42:23 crc kubenswrapper[4909]: W1201 10:42:23.994146 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1168e695_833f_4350_9091_31cf52abecb7.slice/crio-01decb721cdfb409716c83854a4d78888e365843bdbd04a8471157f59d7d4939 WatchSource:0}: Error finding container 01decb721cdfb409716c83854a4d78888e365843bdbd04a8471157f59d7d4939: Status 404 returned error can't find the container with id 01decb721cdfb409716c83854a4d78888e365843bdbd04a8471157f59d7d4939
Dec 01 10:42:24 crc kubenswrapper[4909]: I1201 10:42:24.899491 4909 generic.go:334] "Generic (PLEG): container finished" podID="1168e695-833f-4350-9091-31cf52abecb7" containerID="e75d1f03e89f945f21c85b1a67fe94f607202a094b2c41832f8a00f4383449a7" exitCode=0
Dec 01 10:42:24 crc kubenswrapper[4909]: I1201 10:42:24.899548 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz" event={"ID":"1168e695-833f-4350-9091-31cf52abecb7","Type":"ContainerDied","Data":"e75d1f03e89f945f21c85b1a67fe94f607202a094b2c41832f8a00f4383449a7"}
Dec 01 10:42:24 crc kubenswrapper[4909]: I1201 10:42:24.899580 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz" event={"ID":"1168e695-833f-4350-9091-31cf52abecb7","Type":"ContainerStarted","Data":"01decb721cdfb409716c83854a4d78888e365843bdbd04a8471157f59d7d4939"}
Dec 01 10:42:26 crc kubenswrapper[4909]: I1201 10:42:26.910952 4909 generic.go:334] "Generic (PLEG): container finished" podID="1168e695-833f-4350-9091-31cf52abecb7" containerID="965b34031564712b710d49f9302d20fbe113d7f2cf964f89dc4fc957a6ed54dd" exitCode=0
Dec 01 10:42:26 crc kubenswrapper[4909]: I1201 10:42:26.911067 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz" event={"ID":"1168e695-833f-4350-9091-31cf52abecb7","Type":"ContainerDied","Data":"965b34031564712b710d49f9302d20fbe113d7f2cf964f89dc4fc957a6ed54dd"}
Dec 01 10:42:27 crc kubenswrapper[4909]: I1201 10:42:27.919838 4909 generic.go:334] "Generic (PLEG): container finished" podID="1168e695-833f-4350-9091-31cf52abecb7" containerID="87bb4e1d68aca8578ac35c018ac2814b4053ed6e8020e2faf95b1ac17380a0d4" exitCode=0
Dec 01 10:42:27 crc kubenswrapper[4909]: I1201 10:42:27.919923 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz" event={"ID":"1168e695-833f-4350-9091-31cf52abecb7","Type":"ContainerDied","Data":"87bb4e1d68aca8578ac35c018ac2814b4053ed6e8020e2faf95b1ac17380a0d4"}
Dec 01 10:42:29 crc kubenswrapper[4909]: I1201 10:42:29.178914 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz"
Dec 01 10:42:29 crc kubenswrapper[4909]: I1201 10:42:29.282955 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1168e695-833f-4350-9091-31cf52abecb7-util\") pod \"1168e695-833f-4350-9091-31cf52abecb7\" (UID: \"1168e695-833f-4350-9091-31cf52abecb7\") "
Dec 01 10:42:29 crc kubenswrapper[4909]: I1201 10:42:29.283237 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1168e695-833f-4350-9091-31cf52abecb7-bundle\") pod \"1168e695-833f-4350-9091-31cf52abecb7\" (UID: \"1168e695-833f-4350-9091-31cf52abecb7\") "
Dec 01 10:42:29 crc kubenswrapper[4909]: I1201 10:42:29.283394 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7k2h\" (UniqueName: \"kubernetes.io/projected/1168e695-833f-4350-9091-31cf52abecb7-kube-api-access-l7k2h\") pod \"1168e695-833f-4350-9091-31cf52abecb7\" (UID: \"1168e695-833f-4350-9091-31cf52abecb7\") "
Dec 01 10:42:29 crc kubenswrapper[4909]: I1201 10:42:29.283969 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1168e695-833f-4350-9091-31cf52abecb7-bundle" (OuterVolumeSpecName: "bundle") pod "1168e695-833f-4350-9091-31cf52abecb7" (UID: "1168e695-833f-4350-9091-31cf52abecb7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:42:29 crc kubenswrapper[4909]: I1201 10:42:29.291014 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1168e695-833f-4350-9091-31cf52abecb7-kube-api-access-l7k2h" (OuterVolumeSpecName: "kube-api-access-l7k2h") pod "1168e695-833f-4350-9091-31cf52abecb7" (UID: "1168e695-833f-4350-9091-31cf52abecb7"). InnerVolumeSpecName "kube-api-access-l7k2h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:42:29 crc kubenswrapper[4909]: I1201 10:42:29.296532 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1168e695-833f-4350-9091-31cf52abecb7-util" (OuterVolumeSpecName: "util") pod "1168e695-833f-4350-9091-31cf52abecb7" (UID: "1168e695-833f-4350-9091-31cf52abecb7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:42:29 crc kubenswrapper[4909]: I1201 10:42:29.385618 4909 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1168e695-833f-4350-9091-31cf52abecb7-util\") on node \"crc\" DevicePath \"\""
Dec 01 10:42:29 crc kubenswrapper[4909]: I1201 10:42:29.386129 4909 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1168e695-833f-4350-9091-31cf52abecb7-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 10:42:29 crc kubenswrapper[4909]: I1201 10:42:29.386138 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7k2h\" (UniqueName: \"kubernetes.io/projected/1168e695-833f-4350-9091-31cf52abecb7-kube-api-access-l7k2h\") on node \"crc\" DevicePath \"\""
Dec 01 10:42:29 crc kubenswrapper[4909]: I1201 10:42:29.932680 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz" event={"ID":"1168e695-833f-4350-9091-31cf52abecb7","Type":"ContainerDied","Data":"01decb721cdfb409716c83854a4d78888e365843bdbd04a8471157f59d7d4939"}
Dec 01 10:42:29 crc kubenswrapper[4909]: I1201 10:42:29.932729 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01decb721cdfb409716c83854a4d78888e365843bdbd04a8471157f59d7d4939"
Dec 01 10:42:29 crc kubenswrapper[4909]: I1201 10:42:29.932743 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz"
Dec 01 10:42:31 crc kubenswrapper[4909]: I1201 10:42:31.070267 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-7dgvd"]
Dec 01 10:42:31 crc kubenswrapper[4909]: E1201 10:42:31.070541 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1168e695-833f-4350-9091-31cf52abecb7" containerName="extract"
Dec 01 10:42:31 crc kubenswrapper[4909]: I1201 10:42:31.070556 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1168e695-833f-4350-9091-31cf52abecb7" containerName="extract"
Dec 01 10:42:31 crc kubenswrapper[4909]: E1201 10:42:31.070565 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1168e695-833f-4350-9091-31cf52abecb7" containerName="util"
Dec 01 10:42:31 crc kubenswrapper[4909]: I1201 10:42:31.070571 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1168e695-833f-4350-9091-31cf52abecb7" containerName="util"
Dec 01 10:42:31 crc kubenswrapper[4909]: E1201 10:42:31.070589 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1168e695-833f-4350-9091-31cf52abecb7" containerName="pull"
Dec 01 10:42:31 crc kubenswrapper[4909]: I1201 10:42:31.070595 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1168e695-833f-4350-9091-31cf52abecb7" containerName="pull"
Dec 01 10:42:31 crc kubenswrapper[4909]: I1201 10:42:31.070703 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1168e695-833f-4350-9091-31cf52abecb7" containerName="extract"
Dec 01 10:42:31 crc kubenswrapper[4909]: I1201 10:42:31.071170 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-7dgvd"
Dec 01 10:42:31 crc kubenswrapper[4909]: I1201 10:42:31.076043 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Dec 01 10:42:31 crc kubenswrapper[4909]: I1201 10:42:31.076591 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Dec 01 10:42:31 crc kubenswrapper[4909]: I1201 10:42:31.077542 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-tt2mm"
Dec 01 10:42:31 crc kubenswrapper[4909]: I1201 10:42:31.128916 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-7dgvd"]
Dec 01 10:42:31 crc kubenswrapper[4909]: I1201 10:42:31.211465 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5cmx\" (UniqueName: \"kubernetes.io/projected/3e41ca35-79dd-414a-bfb0-d98fc06ea9ab-kube-api-access-s5cmx\") pod \"nmstate-operator-5b5b58f5c8-7dgvd\" (UID: \"3e41ca35-79dd-414a-bfb0-d98fc06ea9ab\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-7dgvd"
Dec 01 10:42:31 crc kubenswrapper[4909]: I1201 10:42:31.312503 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5cmx\" (UniqueName: \"kubernetes.io/projected/3e41ca35-79dd-414a-bfb0-d98fc06ea9ab-kube-api-access-s5cmx\") pod \"nmstate-operator-5b5b58f5c8-7dgvd\" (UID: \"3e41ca35-79dd-414a-bfb0-d98fc06ea9ab\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-7dgvd"
Dec 01 10:42:31 crc kubenswrapper[4909]: I1201 10:42:31.336719 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5cmx\" (UniqueName: \"kubernetes.io/projected/3e41ca35-79dd-414a-bfb0-d98fc06ea9ab-kube-api-access-s5cmx\") pod \"nmstate-operator-5b5b58f5c8-7dgvd\" (UID: \"3e41ca35-79dd-414a-bfb0-d98fc06ea9ab\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-7dgvd"
Dec 01 10:42:31 crc kubenswrapper[4909]: I1201 10:42:31.394018 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-7dgvd"
Dec 01 10:42:31 crc kubenswrapper[4909]: I1201 10:42:31.623903 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-7dgvd"]
Dec 01 10:42:31 crc kubenswrapper[4909]: W1201 10:42:31.637152 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e41ca35_79dd_414a_bfb0_d98fc06ea9ab.slice/crio-86ab992a36d5cb528fa99164c3e7596ef4516b1be935ef7080b827eb6c01a24f WatchSource:0}: Error finding container 86ab992a36d5cb528fa99164c3e7596ef4516b1be935ef7080b827eb6c01a24f: Status 404 returned error can't find the container with id 86ab992a36d5cb528fa99164c3e7596ef4516b1be935ef7080b827eb6c01a24f
Dec 01 10:42:31 crc kubenswrapper[4909]: I1201 10:42:31.957319 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-7dgvd" event={"ID":"3e41ca35-79dd-414a-bfb0-d98fc06ea9ab","Type":"ContainerStarted","Data":"86ab992a36d5cb528fa99164c3e7596ef4516b1be935ef7080b827eb6c01a24f"}
Dec 01 10:42:34 crc kubenswrapper[4909]: I1201 10:42:34.980026 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-7dgvd" event={"ID":"3e41ca35-79dd-414a-bfb0-d98fc06ea9ab","Type":"ContainerStarted","Data":"821c563bd6b4a350eceb622877249969a32a57e65937728b6fb1f7860e9c26cc"}
Dec 01 10:42:35 crc kubenswrapper[4909]: I1201 10:42:35.012127 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-7dgvd" podStartSLOduration=1.656252584 podStartE2EDuration="4.012086023s" podCreationTimestamp="2025-12-01 10:42:31 +0000 UTC" firstStartedPulling="2025-12-01 10:42:31.641934936 +0000 UTC m=+668.876405834" lastFinishedPulling="2025-12-01 10:42:33.997768375 +0000 UTC m=+671.232239273" observedRunningTime="2025-12-01 10:42:35.006594446 +0000 UTC m=+672.241065344" watchObservedRunningTime="2025-12-01 10:42:35.012086023 +0000 UTC m=+672.246556921"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.047065 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-s6vqh"]
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.048424 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-s6vqh"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.052476 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-2szhp"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.059811 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-s6vqh"]
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.072717 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78fh2"]
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.073647 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78fh2"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.076729 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.093893 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78fh2"]
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.097905 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-rg5bv"]
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.098769 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-rg5bv"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.185175 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3984210c-0ef8-41fb-b086-fc47d00b1101-nmstate-lock\") pod \"nmstate-handler-rg5bv\" (UID: \"3984210c-0ef8-41fb-b086-fc47d00b1101\") " pod="openshift-nmstate/nmstate-handler-rg5bv"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.185258 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdjm9\" (UniqueName: \"kubernetes.io/projected/3984210c-0ef8-41fb-b086-fc47d00b1101-kube-api-access-hdjm9\") pod \"nmstate-handler-rg5bv\" (UID: \"3984210c-0ef8-41fb-b086-fc47d00b1101\") " pod="openshift-nmstate/nmstate-handler-rg5bv"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.185414 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3984210c-0ef8-41fb-b086-fc47d00b1101-dbus-socket\") pod \"nmstate-handler-rg5bv\" (UID: \"3984210c-0ef8-41fb-b086-fc47d00b1101\") " pod="openshift-nmstate/nmstate-handler-rg5bv"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.185909 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3984210c-0ef8-41fb-b086-fc47d00b1101-ovs-socket\") pod \"nmstate-handler-rg5bv\" (UID: \"3984210c-0ef8-41fb-b086-fc47d00b1101\") " pod="openshift-nmstate/nmstate-handler-rg5bv"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.185990 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/dd3296b5-d517-4d8e-8572-107b49b7d78b-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-78fh2\" (UID: \"dd3296b5-d517-4d8e-8572-107b49b7d78b\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78fh2"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.186016 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw4j4\" (UniqueName: \"kubernetes.io/projected/dd3296b5-d517-4d8e-8572-107b49b7d78b-kube-api-access-bw4j4\") pod \"nmstate-webhook-5f6d4c5ccb-78fh2\" (UID: \"dd3296b5-d517-4d8e-8572-107b49b7d78b\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78fh2"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.186128 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwbfc\" (UniqueName: \"kubernetes.io/projected/d40ebf8e-cd42-4946-ad61-06e5d3e254da-kube-api-access-vwbfc\") pod \"nmstate-metrics-7f946cbc9-s6vqh\" (UID: \"d40ebf8e-cd42-4946-ad61-06e5d3e254da\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-s6vqh"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.202802 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kbcdt"]
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.203607 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kbcdt"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.207509 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.207968 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.207984 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-fq47c"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.212920 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kbcdt"]
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.287709 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwbfc\" (UniqueName: \"kubernetes.io/projected/d40ebf8e-cd42-4946-ad61-06e5d3e254da-kube-api-access-vwbfc\") pod \"nmstate-metrics-7f946cbc9-s6vqh\" (UID: \"d40ebf8e-cd42-4946-ad61-06e5d3e254da\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-s6vqh"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.287765 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3984210c-0ef8-41fb-b086-fc47d00b1101-nmstate-lock\") pod \"nmstate-handler-rg5bv\" (UID: \"3984210c-0ef8-41fb-b086-fc47d00b1101\") " pod="openshift-nmstate/nmstate-handler-rg5bv"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.287799 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdjm9\" (UniqueName: \"kubernetes.io/projected/3984210c-0ef8-41fb-b086-fc47d00b1101-kube-api-access-hdjm9\") pod \"nmstate-handler-rg5bv\" (UID: \"3984210c-0ef8-41fb-b086-fc47d00b1101\") " pod="openshift-nmstate/nmstate-handler-rg5bv"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.287826 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3984210c-0ef8-41fb-b086-fc47d00b1101-dbus-socket\") pod \"nmstate-handler-rg5bv\" (UID: \"3984210c-0ef8-41fb-b086-fc47d00b1101\") " pod="openshift-nmstate/nmstate-handler-rg5bv"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.287853 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3984210c-0ef8-41fb-b086-fc47d00b1101-ovs-socket\") pod \"nmstate-handler-rg5bv\" (UID: \"3984210c-0ef8-41fb-b086-fc47d00b1101\") " pod="openshift-nmstate/nmstate-handler-rg5bv"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.287894 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/dd3296b5-d517-4d8e-8572-107b49b7d78b-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-78fh2\" (UID: \"dd3296b5-d517-4d8e-8572-107b49b7d78b\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78fh2"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.287910 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw4j4\" (UniqueName: \"kubernetes.io/projected/dd3296b5-d517-4d8e-8572-107b49b7d78b-kube-api-access-bw4j4\") pod \"nmstate-webhook-5f6d4c5ccb-78fh2\" (UID: \"dd3296b5-d517-4d8e-8572-107b49b7d78b\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78fh2"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.288174 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3984210c-0ef8-41fb-b086-fc47d00b1101-nmstate-lock\") pod \"nmstate-handler-rg5bv\" (UID: \"3984210c-0ef8-41fb-b086-fc47d00b1101\") " pod="openshift-nmstate/nmstate-handler-rg5bv"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.288192 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3984210c-0ef8-41fb-b086-fc47d00b1101-ovs-socket\") pod \"nmstate-handler-rg5bv\" (UID: \"3984210c-0ef8-41fb-b086-fc47d00b1101\") " pod="openshift-nmstate/nmstate-handler-rg5bv"
Dec 01 10:42:36 crc kubenswrapper[4909]: E1201 10:42:36.288283 4909 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Dec 01 10:42:36 crc kubenswrapper[4909]: E1201 10:42:36.288443 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd3296b5-d517-4d8e-8572-107b49b7d78b-tls-key-pair podName:dd3296b5-d517-4d8e-8572-107b49b7d78b nodeName:}" failed. No retries permitted until 2025-12-01 10:42:36.788423361 +0000 UTC m=+674.022894259 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/dd3296b5-d517-4d8e-8572-107b49b7d78b-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-78fh2" (UID: "dd3296b5-d517-4d8e-8572-107b49b7d78b") : secret "openshift-nmstate-webhook" not found
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.307235 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3984210c-0ef8-41fb-b086-fc47d00b1101-dbus-socket\") pod \"nmstate-handler-rg5bv\" (UID: \"3984210c-0ef8-41fb-b086-fc47d00b1101\") " pod="openshift-nmstate/nmstate-handler-rg5bv"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.325538 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw4j4\" (UniqueName: \"kubernetes.io/projected/dd3296b5-d517-4d8e-8572-107b49b7d78b-kube-api-access-bw4j4\") pod \"nmstate-webhook-5f6d4c5ccb-78fh2\" (UID: \"dd3296b5-d517-4d8e-8572-107b49b7d78b\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78fh2"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.328156 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwbfc\" (UniqueName: \"kubernetes.io/projected/d40ebf8e-cd42-4946-ad61-06e5d3e254da-kube-api-access-vwbfc\") pod \"nmstate-metrics-7f946cbc9-s6vqh\" (UID: \"d40ebf8e-cd42-4946-ad61-06e5d3e254da\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-s6vqh"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.340182 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdjm9\" (UniqueName: \"kubernetes.io/projected/3984210c-0ef8-41fb-b086-fc47d00b1101-kube-api-access-hdjm9\") pod \"nmstate-handler-rg5bv\" (UID: \"3984210c-0ef8-41fb-b086-fc47d00b1101\") " pod="openshift-nmstate/nmstate-handler-rg5bv"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.372165 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-s6vqh"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.389056 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c33158af-8ace-44c3-bb11-587dc452768e-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-kbcdt\" (UID: \"c33158af-8ace-44c3-bb11-587dc452768e\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kbcdt"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.389140 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c33158af-8ace-44c3-bb11-587dc452768e-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-kbcdt\" (UID: \"c33158af-8ace-44c3-bb11-587dc452768e\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kbcdt"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.389174 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqfnt\" (UniqueName: \"kubernetes.io/projected/c33158af-8ace-44c3-bb11-587dc452768e-kube-api-access-sqfnt\") pod \"nmstate-console-plugin-7fbb5f6569-kbcdt\" (UID: \"c33158af-8ace-44c3-bb11-587dc452768e\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kbcdt"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.419888 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-rg5bv"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.443791 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-769cdcbc88-87c6h"]
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.444663 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-769cdcbc88-87c6h"
Dec 01 10:42:36 crc kubenswrapper[4909]: W1201 10:42:36.459996 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3984210c_0ef8_41fb_b086_fc47d00b1101.slice/crio-1433b0ca88c78014b6af620efaf5d9237fa6ec18b28a3246d4db4a6755830b57 WatchSource:0}: Error finding container 1433b0ca88c78014b6af620efaf5d9237fa6ec18b28a3246d4db4a6755830b57: Status 404 returned error can't find the container with id 1433b0ca88c78014b6af620efaf5d9237fa6ec18b28a3246d4db4a6755830b57
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.490749 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c33158af-8ace-44c3-bb11-587dc452768e-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-kbcdt\" (UID: \"c33158af-8ace-44c3-bb11-587dc452768e\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kbcdt"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.490828 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c33158af-8ace-44c3-bb11-587dc452768e-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-kbcdt\" (UID: \"c33158af-8ace-44c3-bb11-587dc452768e\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kbcdt"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.490938 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqfnt\" (UniqueName: \"kubernetes.io/projected/c33158af-8ace-44c3-bb11-587dc452768e-kube-api-access-sqfnt\") pod \"nmstate-console-plugin-7fbb5f6569-kbcdt\" (UID: \"c33158af-8ace-44c3-bb11-587dc452768e\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kbcdt"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.491855 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c33158af-8ace-44c3-bb11-587dc452768e-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-kbcdt\" (UID: \"c33158af-8ace-44c3-bb11-587dc452768e\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kbcdt"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.497087 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c33158af-8ace-44c3-bb11-587dc452768e-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-kbcdt\" (UID: \"c33158af-8ace-44c3-bb11-587dc452768e\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kbcdt"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.505574 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-769cdcbc88-87c6h"]
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.512648 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqfnt\" (UniqueName: \"kubernetes.io/projected/c33158af-8ace-44c3-bb11-587dc452768e-kube-api-access-sqfnt\") pod \"nmstate-console-plugin-7fbb5f6569-kbcdt\" (UID: \"c33158af-8ace-44c3-bb11-587dc452768e\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kbcdt"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.519414 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kbcdt"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.592550 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/525075d8-4494-4987-ab64-55988035b4b0-oauth-serving-cert\") pod \"console-769cdcbc88-87c6h\" (UID: \"525075d8-4494-4987-ab64-55988035b4b0\") " pod="openshift-console/console-769cdcbc88-87c6h"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.592592 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8whpr\" (UniqueName: \"kubernetes.io/projected/525075d8-4494-4987-ab64-55988035b4b0-kube-api-access-8whpr\") pod \"console-769cdcbc88-87c6h\" (UID: \"525075d8-4494-4987-ab64-55988035b4b0\") " pod="openshift-console/console-769cdcbc88-87c6h"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.592625 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/525075d8-4494-4987-ab64-55988035b4b0-service-ca\") pod \"console-769cdcbc88-87c6h\" (UID: \"525075d8-4494-4987-ab64-55988035b4b0\") " pod="openshift-console/console-769cdcbc88-87c6h"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.592806 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/525075d8-4494-4987-ab64-55988035b4b0-console-oauth-config\") pod \"console-769cdcbc88-87c6h\" (UID: \"525075d8-4494-4987-ab64-55988035b4b0\") " pod="openshift-console/console-769cdcbc88-87c6h"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.592906 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/525075d8-4494-4987-ab64-55988035b4b0-console-config\") pod \"console-769cdcbc88-87c6h\" (UID: \"525075d8-4494-4987-ab64-55988035b4b0\") " pod="openshift-console/console-769cdcbc88-87c6h"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.592950 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/525075d8-4494-4987-ab64-55988035b4b0-console-serving-cert\") pod \"console-769cdcbc88-87c6h\" (UID: \"525075d8-4494-4987-ab64-55988035b4b0\") " pod="openshift-console/console-769cdcbc88-87c6h"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.593001 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/525075d8-4494-4987-ab64-55988035b4b0-trusted-ca-bundle\") pod \"console-769cdcbc88-87c6h\" (UID: \"525075d8-4494-4987-ab64-55988035b4b0\") " pod="openshift-console/console-769cdcbc88-87c6h"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.622242 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-s6vqh"]
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.694070 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/525075d8-4494-4987-ab64-55988035b4b0-service-ca\") pod \"console-769cdcbc88-87c6h\" (UID: \"525075d8-4494-4987-ab64-55988035b4b0\") " pod="openshift-console/console-769cdcbc88-87c6h"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.694168 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/525075d8-4494-4987-ab64-55988035b4b0-console-oauth-config\") pod \"console-769cdcbc88-87c6h\" (UID: \"525075d8-4494-4987-ab64-55988035b4b0\") " pod="openshift-console/console-769cdcbc88-87c6h"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.694209 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/525075d8-4494-4987-ab64-55988035b4b0-console-config\") pod \"console-769cdcbc88-87c6h\" (UID: \"525075d8-4494-4987-ab64-55988035b4b0\") " pod="openshift-console/console-769cdcbc88-87c6h"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.694244 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/525075d8-4494-4987-ab64-55988035b4b0-console-serving-cert\") pod \"console-769cdcbc88-87c6h\" (UID: \"525075d8-4494-4987-ab64-55988035b4b0\") " pod="openshift-console/console-769cdcbc88-87c6h"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.694373 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/525075d8-4494-4987-ab64-55988035b4b0-trusted-ca-bundle\") pod \"console-769cdcbc88-87c6h\" (UID: \"525075d8-4494-4987-ab64-55988035b4b0\") " pod="openshift-console/console-769cdcbc88-87c6h"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.694412 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/525075d8-4494-4987-ab64-55988035b4b0-oauth-serving-cert\") pod \"console-769cdcbc88-87c6h\" (UID: \"525075d8-4494-4987-ab64-55988035b4b0\") " pod="openshift-console/console-769cdcbc88-87c6h"
Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.694437 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8whpr\" (UniqueName:
\"kubernetes.io/projected/525075d8-4494-4987-ab64-55988035b4b0-kube-api-access-8whpr\") pod \"console-769cdcbc88-87c6h\" (UID: \"525075d8-4494-4987-ab64-55988035b4b0\") " pod="openshift-console/console-769cdcbc88-87c6h" Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.695058 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/525075d8-4494-4987-ab64-55988035b4b0-service-ca\") pod \"console-769cdcbc88-87c6h\" (UID: \"525075d8-4494-4987-ab64-55988035b4b0\") " pod="openshift-console/console-769cdcbc88-87c6h" Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.695847 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/525075d8-4494-4987-ab64-55988035b4b0-console-config\") pod \"console-769cdcbc88-87c6h\" (UID: \"525075d8-4494-4987-ab64-55988035b4b0\") " pod="openshift-console/console-769cdcbc88-87c6h" Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.696497 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/525075d8-4494-4987-ab64-55988035b4b0-oauth-serving-cert\") pod \"console-769cdcbc88-87c6h\" (UID: \"525075d8-4494-4987-ab64-55988035b4b0\") " pod="openshift-console/console-769cdcbc88-87c6h" Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.697946 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/525075d8-4494-4987-ab64-55988035b4b0-trusted-ca-bundle\") pod \"console-769cdcbc88-87c6h\" (UID: \"525075d8-4494-4987-ab64-55988035b4b0\") " pod="openshift-console/console-769cdcbc88-87c6h" Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.698336 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/525075d8-4494-4987-ab64-55988035b4b0-console-oauth-config\") pod \"console-769cdcbc88-87c6h\" (UID: \"525075d8-4494-4987-ab64-55988035b4b0\") " pod="openshift-console/console-769cdcbc88-87c6h" Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.698863 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/525075d8-4494-4987-ab64-55988035b4b0-console-serving-cert\") pod \"console-769cdcbc88-87c6h\" (UID: \"525075d8-4494-4987-ab64-55988035b4b0\") " pod="openshift-console/console-769cdcbc88-87c6h" Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.713329 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8whpr\" (UniqueName: \"kubernetes.io/projected/525075d8-4494-4987-ab64-55988035b4b0-kube-api-access-8whpr\") pod \"console-769cdcbc88-87c6h\" (UID: \"525075d8-4494-4987-ab64-55988035b4b0\") " pod="openshift-console/console-769cdcbc88-87c6h" Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.726896 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kbcdt"] Dec 01 10:42:36 crc kubenswrapper[4909]: W1201 10:42:36.731397 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc33158af_8ace_44c3_bb11_587dc452768e.slice/crio-aab1aebb6c70b891bbf96c091df88782656340ebbf81c86ef87b7bbae783c22d WatchSource:0}: Error finding container aab1aebb6c70b891bbf96c091df88782656340ebbf81c86ef87b7bbae783c22d: Status 404 returned error can't find the container with id aab1aebb6c70b891bbf96c091df88782656340ebbf81c86ef87b7bbae783c22d Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.777303 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-769cdcbc88-87c6h" Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.799262 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/dd3296b5-d517-4d8e-8572-107b49b7d78b-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-78fh2\" (UID: \"dd3296b5-d517-4d8e-8572-107b49b7d78b\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78fh2" Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.805711 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/dd3296b5-d517-4d8e-8572-107b49b7d78b-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-78fh2\" (UID: \"dd3296b5-d517-4d8e-8572-107b49b7d78b\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78fh2" Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.964014 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-769cdcbc88-87c6h"] Dec 01 10:42:36 crc kubenswrapper[4909]: W1201 10:42:36.973765 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod525075d8_4494_4987_ab64_55988035b4b0.slice/crio-5eda7411b9d976f32007874930b064457ca360ccccf907f93afbdec1cd90a0c1 WatchSource:0}: Error finding container 5eda7411b9d976f32007874930b064457ca360ccccf907f93afbdec1cd90a0c1: Status 404 returned error can't find the container with id 5eda7411b9d976f32007874930b064457ca360ccccf907f93afbdec1cd90a0c1 Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.992303 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78fh2" Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.994130 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-rg5bv" event={"ID":"3984210c-0ef8-41fb-b086-fc47d00b1101","Type":"ContainerStarted","Data":"1433b0ca88c78014b6af620efaf5d9237fa6ec18b28a3246d4db4a6755830b57"} Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.995077 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-s6vqh" event={"ID":"d40ebf8e-cd42-4946-ad61-06e5d3e254da","Type":"ContainerStarted","Data":"f3a39a93dc15a1b79139e7700220e5fd60b04c44e79e6f1bef3cc290791f4020"} Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.996180 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kbcdt" event={"ID":"c33158af-8ace-44c3-bb11-587dc452768e","Type":"ContainerStarted","Data":"aab1aebb6c70b891bbf96c091df88782656340ebbf81c86ef87b7bbae783c22d"} Dec 01 10:42:36 crc kubenswrapper[4909]: I1201 10:42:36.997570 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-769cdcbc88-87c6h" event={"ID":"525075d8-4494-4987-ab64-55988035b4b0","Type":"ContainerStarted","Data":"5eda7411b9d976f32007874930b064457ca360ccccf907f93afbdec1cd90a0c1"} Dec 01 10:42:37 crc kubenswrapper[4909]: I1201 10:42:37.188562 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78fh2"] Dec 01 10:42:37 crc kubenswrapper[4909]: W1201 10:42:37.193292 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd3296b5_d517_4d8e_8572_107b49b7d78b.slice/crio-b6de6b947f0d51e5153fc7205793d46596d23b1a00de797d4b98743787dc6fc6 WatchSource:0}: Error finding container b6de6b947f0d51e5153fc7205793d46596d23b1a00de797d4b98743787dc6fc6: Status 404 returned error can't 
find the container with id b6de6b947f0d51e5153fc7205793d46596d23b1a00de797d4b98743787dc6fc6 Dec 01 10:42:38 crc kubenswrapper[4909]: I1201 10:42:38.015647 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-769cdcbc88-87c6h" event={"ID":"525075d8-4494-4987-ab64-55988035b4b0","Type":"ContainerStarted","Data":"6dfcd66dfa982a0184406a58efef82f5fc3aa0f5e6099064162d259fc44be682"} Dec 01 10:42:38 crc kubenswrapper[4909]: I1201 10:42:38.023033 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78fh2" event={"ID":"dd3296b5-d517-4d8e-8572-107b49b7d78b","Type":"ContainerStarted","Data":"b6de6b947f0d51e5153fc7205793d46596d23b1a00de797d4b98743787dc6fc6"} Dec 01 10:42:38 crc kubenswrapper[4909]: I1201 10:42:38.047442 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-769cdcbc88-87c6h" podStartSLOduration=2.047391921 podStartE2EDuration="2.047391921s" podCreationTimestamp="2025-12-01 10:42:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:42:38.043368139 +0000 UTC m=+675.277839037" watchObservedRunningTime="2025-12-01 10:42:38.047391921 +0000 UTC m=+675.281862819" Dec 01 10:42:40 crc kubenswrapper[4909]: I1201 10:42:40.038575 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-s6vqh" event={"ID":"d40ebf8e-cd42-4946-ad61-06e5d3e254da","Type":"ContainerStarted","Data":"aaae6f602567ed9dc5a0ea3e7de9b61f2827ec30c02417ef906e2c06055b5d4c"} Dec 01 10:42:40 crc kubenswrapper[4909]: I1201 10:42:40.040586 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-rg5bv" event={"ID":"3984210c-0ef8-41fb-b086-fc47d00b1101","Type":"ContainerStarted","Data":"2ed6aa011974e1a003d3736f5a84470b9c900b330293726a851f5ab959480fa4"} Dec 01 10:42:40 crc kubenswrapper[4909]: I1201 
10:42:40.040685 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-rg5bv" Dec 01 10:42:40 crc kubenswrapper[4909]: I1201 10:42:40.044034 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78fh2" event={"ID":"dd3296b5-d517-4d8e-8572-107b49b7d78b","Type":"ContainerStarted","Data":"85def73698bb3729d842e2e323c71c1ded980fc2cff453d6d062966d34e0ac41"} Dec 01 10:42:40 crc kubenswrapper[4909]: I1201 10:42:40.044295 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78fh2" Dec 01 10:42:40 crc kubenswrapper[4909]: I1201 10:42:40.060888 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-rg5bv" podStartSLOduration=1.069803541 podStartE2EDuration="4.060842694s" podCreationTimestamp="2025-12-01 10:42:36 +0000 UTC" firstStartedPulling="2025-12-01 10:42:36.476934125 +0000 UTC m=+673.711405023" lastFinishedPulling="2025-12-01 10:42:39.467973278 +0000 UTC m=+676.702444176" observedRunningTime="2025-12-01 10:42:40.060275366 +0000 UTC m=+677.294746274" watchObservedRunningTime="2025-12-01 10:42:40.060842694 +0000 UTC m=+677.295313592" Dec 01 10:42:40 crc kubenswrapper[4909]: I1201 10:42:40.106456 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78fh2" podStartSLOduration=1.863396103 podStartE2EDuration="4.106433061s" podCreationTimestamp="2025-12-01 10:42:36 +0000 UTC" firstStartedPulling="2025-12-01 10:42:37.196619149 +0000 UTC m=+674.431090047" lastFinishedPulling="2025-12-01 10:42:39.439656107 +0000 UTC m=+676.674127005" observedRunningTime="2025-12-01 10:42:40.105127771 +0000 UTC m=+677.339598689" watchObservedRunningTime="2025-12-01 10:42:40.106433061 +0000 UTC m=+677.340903959" Dec 01 10:42:41 crc kubenswrapper[4909]: I1201 10:42:41.059454 4909 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kbcdt" event={"ID":"c33158af-8ace-44c3-bb11-587dc452768e","Type":"ContainerStarted","Data":"75f8e41a359b4752ec621fe7343e1d2a885cb17d73224ebe8a38a9668eb672fb"} Dec 01 10:42:41 crc kubenswrapper[4909]: I1201 10:42:41.080451 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kbcdt" podStartSLOduration=1.025233316 podStartE2EDuration="5.080421251s" podCreationTimestamp="2025-12-01 10:42:36 +0000 UTC" firstStartedPulling="2025-12-01 10:42:36.734075139 +0000 UTC m=+673.968546037" lastFinishedPulling="2025-12-01 10:42:40.789263074 +0000 UTC m=+678.023733972" observedRunningTime="2025-12-01 10:42:41.07806186 +0000 UTC m=+678.312532778" watchObservedRunningTime="2025-12-01 10:42:41.080421251 +0000 UTC m=+678.314892159" Dec 01 10:42:43 crc kubenswrapper[4909]: I1201 10:42:43.080418 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-s6vqh" event={"ID":"d40ebf8e-cd42-4946-ad61-06e5d3e254da","Type":"ContainerStarted","Data":"149edb43d1fb0dcc78a50ebc0b6c0d48491fde2ccf95153d97090b217c29df7c"} Dec 01 10:42:43 crc kubenswrapper[4909]: I1201 10:42:43.105937 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-s6vqh" podStartSLOduration=1.461391705 podStartE2EDuration="7.105908581s" podCreationTimestamp="2025-12-01 10:42:36 +0000 UTC" firstStartedPulling="2025-12-01 10:42:36.631687383 +0000 UTC m=+673.866158281" lastFinishedPulling="2025-12-01 10:42:42.276204259 +0000 UTC m=+679.510675157" observedRunningTime="2025-12-01 10:42:43.10329391 +0000 UTC m=+680.337764818" watchObservedRunningTime="2025-12-01 10:42:43.105908581 +0000 UTC m=+680.340379479" Dec 01 10:42:46 crc kubenswrapper[4909]: I1201 10:42:46.450425 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-nmstate/nmstate-handler-rg5bv" Dec 01 10:42:46 crc kubenswrapper[4909]: I1201 10:42:46.777598 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-769cdcbc88-87c6h" Dec 01 10:42:46 crc kubenswrapper[4909]: I1201 10:42:46.777695 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-769cdcbc88-87c6h" Dec 01 10:42:46 crc kubenswrapper[4909]: I1201 10:42:46.785226 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-769cdcbc88-87c6h" Dec 01 10:42:47 crc kubenswrapper[4909]: I1201 10:42:47.109990 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-769cdcbc88-87c6h" Dec 01 10:42:47 crc kubenswrapper[4909]: I1201 10:42:47.169981 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-wfgm2"] Dec 01 10:42:57 crc kubenswrapper[4909]: I1201 10:42:57.000159 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78fh2" Dec 01 10:43:06 crc kubenswrapper[4909]: I1201 10:43:06.194056 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:43:06 crc kubenswrapper[4909]: I1201 10:43:06.195074 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:43:10 crc kubenswrapper[4909]: I1201 10:43:10.803431 4909 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7"] Dec 01 10:43:10 crc kubenswrapper[4909]: I1201 10:43:10.804957 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7" Dec 01 10:43:10 crc kubenswrapper[4909]: I1201 10:43:10.807377 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 10:43:10 crc kubenswrapper[4909]: I1201 10:43:10.815677 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7"] Dec 01 10:43:10 crc kubenswrapper[4909]: I1201 10:43:10.832056 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1beb94c-1e21-4dd7-814e-c345e45dc803-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7\" (UID: \"a1beb94c-1e21-4dd7-814e-c345e45dc803\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7" Dec 01 10:43:10 crc kubenswrapper[4909]: I1201 10:43:10.832272 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1beb94c-1e21-4dd7-814e-c345e45dc803-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7\" (UID: \"a1beb94c-1e21-4dd7-814e-c345e45dc803\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7" Dec 01 10:43:10 crc kubenswrapper[4909]: I1201 10:43:10.832426 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqmdq\" (UniqueName: \"kubernetes.io/projected/a1beb94c-1e21-4dd7-814e-c345e45dc803-kube-api-access-mqmdq\") pod 
\"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7\" (UID: \"a1beb94c-1e21-4dd7-814e-c345e45dc803\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7" Dec 01 10:43:10 crc kubenswrapper[4909]: I1201 10:43:10.933979 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1beb94c-1e21-4dd7-814e-c345e45dc803-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7\" (UID: \"a1beb94c-1e21-4dd7-814e-c345e45dc803\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7" Dec 01 10:43:10 crc kubenswrapper[4909]: I1201 10:43:10.934208 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1beb94c-1e21-4dd7-814e-c345e45dc803-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7\" (UID: \"a1beb94c-1e21-4dd7-814e-c345e45dc803\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7" Dec 01 10:43:10 crc kubenswrapper[4909]: I1201 10:43:10.934350 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqmdq\" (UniqueName: \"kubernetes.io/projected/a1beb94c-1e21-4dd7-814e-c345e45dc803-kube-api-access-mqmdq\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7\" (UID: \"a1beb94c-1e21-4dd7-814e-c345e45dc803\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7" Dec 01 10:43:10 crc kubenswrapper[4909]: I1201 10:43:10.934542 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1beb94c-1e21-4dd7-814e-c345e45dc803-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7\" (UID: \"a1beb94c-1e21-4dd7-814e-c345e45dc803\") " 
pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7" Dec 01 10:43:10 crc kubenswrapper[4909]: I1201 10:43:10.934601 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1beb94c-1e21-4dd7-814e-c345e45dc803-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7\" (UID: \"a1beb94c-1e21-4dd7-814e-c345e45dc803\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7" Dec 01 10:43:10 crc kubenswrapper[4909]: I1201 10:43:10.951849 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqmdq\" (UniqueName: \"kubernetes.io/projected/a1beb94c-1e21-4dd7-814e-c345e45dc803-kube-api-access-mqmdq\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7\" (UID: \"a1beb94c-1e21-4dd7-814e-c345e45dc803\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7" Dec 01 10:43:11 crc kubenswrapper[4909]: I1201 10:43:11.123026 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7" Dec 01 10:43:11 crc kubenswrapper[4909]: I1201 10:43:11.520522 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7"] Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.232656 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-wfgm2" podUID="a366b491-4c3c-40a9-86a0-a82d686b1a15" containerName="console" containerID="cri-o://2dcc3ccc9d7778a7c6fa74d7b692a89a00873b2026aa0daaa5ef7f4d1d939d8c" gracePeriod=15 Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.270853 4909 generic.go:334] "Generic (PLEG): container finished" podID="a1beb94c-1e21-4dd7-814e-c345e45dc803" containerID="472557fd33e7eb7150a047f29bcfd3cee35afff15ed7521be0b0baa85b8dc83c" exitCode=0 Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.270961 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7" event={"ID":"a1beb94c-1e21-4dd7-814e-c345e45dc803","Type":"ContainerDied","Data":"472557fd33e7eb7150a047f29bcfd3cee35afff15ed7521be0b0baa85b8dc83c"} Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.271050 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7" event={"ID":"a1beb94c-1e21-4dd7-814e-c345e45dc803","Type":"ContainerStarted","Data":"cdf7bf0eec6b9282174e3972c489cf9ef0a3bf0c279160d7cfc397461792c53a"} Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.665427 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wfgm2_a366b491-4c3c-40a9-86a0-a82d686b1a15/console/0.log" Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.665960 4909 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.764249 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a366b491-4c3c-40a9-86a0-a82d686b1a15-console-serving-cert\") pod \"a366b491-4c3c-40a9-86a0-a82d686b1a15\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.764315 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-service-ca\") pod \"a366b491-4c3c-40a9-86a0-a82d686b1a15\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.764374 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85wd6\" (UniqueName: \"kubernetes.io/projected/a366b491-4c3c-40a9-86a0-a82d686b1a15-kube-api-access-85wd6\") pod \"a366b491-4c3c-40a9-86a0-a82d686b1a15\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.764397 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-oauth-serving-cert\") pod \"a366b491-4c3c-40a9-86a0-a82d686b1a15\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.764419 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-trusted-ca-bundle\") pod \"a366b491-4c3c-40a9-86a0-a82d686b1a15\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.764437 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a366b491-4c3c-40a9-86a0-a82d686b1a15-console-oauth-config\") pod \"a366b491-4c3c-40a9-86a0-a82d686b1a15\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.764488 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-console-config\") pod \"a366b491-4c3c-40a9-86a0-a82d686b1a15\" (UID: \"a366b491-4c3c-40a9-86a0-a82d686b1a15\") " Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.765159 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-console-config" (OuterVolumeSpecName: "console-config") pod "a366b491-4c3c-40a9-86a0-a82d686b1a15" (UID: "a366b491-4c3c-40a9-86a0-a82d686b1a15"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.765404 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a366b491-4c3c-40a9-86a0-a82d686b1a15" (UID: "a366b491-4c3c-40a9-86a0-a82d686b1a15"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.765485 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-service-ca" (OuterVolumeSpecName: "service-ca") pod "a366b491-4c3c-40a9-86a0-a82d686b1a15" (UID: "a366b491-4c3c-40a9-86a0-a82d686b1a15"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.765968 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a366b491-4c3c-40a9-86a0-a82d686b1a15" (UID: "a366b491-4c3c-40a9-86a0-a82d686b1a15"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.770575 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a366b491-4c3c-40a9-86a0-a82d686b1a15-kube-api-access-85wd6" (OuterVolumeSpecName: "kube-api-access-85wd6") pod "a366b491-4c3c-40a9-86a0-a82d686b1a15" (UID: "a366b491-4c3c-40a9-86a0-a82d686b1a15"). InnerVolumeSpecName "kube-api-access-85wd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.771012 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a366b491-4c3c-40a9-86a0-a82d686b1a15-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a366b491-4c3c-40a9-86a0-a82d686b1a15" (UID: "a366b491-4c3c-40a9-86a0-a82d686b1a15"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.774183 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a366b491-4c3c-40a9-86a0-a82d686b1a15-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a366b491-4c3c-40a9-86a0-a82d686b1a15" (UID: "a366b491-4c3c-40a9-86a0-a82d686b1a15"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.865752 4909 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.865779 4909 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a366b491-4c3c-40a9-86a0-a82d686b1a15-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.865789 4909 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.865798 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85wd6\" (UniqueName: \"kubernetes.io/projected/a366b491-4c3c-40a9-86a0-a82d686b1a15-kube-api-access-85wd6\") on node \"crc\" DevicePath \"\"" Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.865808 4909 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.865818 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a366b491-4c3c-40a9-86a0-a82d686b1a15-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:43:12 crc kubenswrapper[4909]: I1201 10:43:12.865828 4909 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a366b491-4c3c-40a9-86a0-a82d686b1a15-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:43:13 crc 
kubenswrapper[4909]: I1201 10:43:13.280435 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wfgm2_a366b491-4c3c-40a9-86a0-a82d686b1a15/console/0.log" Dec 01 10:43:13 crc kubenswrapper[4909]: I1201 10:43:13.280515 4909 generic.go:334] "Generic (PLEG): container finished" podID="a366b491-4c3c-40a9-86a0-a82d686b1a15" containerID="2dcc3ccc9d7778a7c6fa74d7b692a89a00873b2026aa0daaa5ef7f4d1d939d8c" exitCode=2 Dec 01 10:43:13 crc kubenswrapper[4909]: I1201 10:43:13.280559 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wfgm2" event={"ID":"a366b491-4c3c-40a9-86a0-a82d686b1a15","Type":"ContainerDied","Data":"2dcc3ccc9d7778a7c6fa74d7b692a89a00873b2026aa0daaa5ef7f4d1d939d8c"} Dec 01 10:43:13 crc kubenswrapper[4909]: I1201 10:43:13.280639 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wfgm2" Dec 01 10:43:13 crc kubenswrapper[4909]: I1201 10:43:13.280665 4909 scope.go:117] "RemoveContainer" containerID="2dcc3ccc9d7778a7c6fa74d7b692a89a00873b2026aa0daaa5ef7f4d1d939d8c" Dec 01 10:43:13 crc kubenswrapper[4909]: I1201 10:43:13.280677 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wfgm2" event={"ID":"a366b491-4c3c-40a9-86a0-a82d686b1a15","Type":"ContainerDied","Data":"9bb9d4c1d5385aed3314778bb88d9cb86cea8ef02c20db0d8a114bfdac19d090"} Dec 01 10:43:13 crc kubenswrapper[4909]: I1201 10:43:13.312697 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-wfgm2"] Dec 01 10:43:13 crc kubenswrapper[4909]: I1201 10:43:13.314636 4909 scope.go:117] "RemoveContainer" containerID="2dcc3ccc9d7778a7c6fa74d7b692a89a00873b2026aa0daaa5ef7f4d1d939d8c" Dec 01 10:43:13 crc kubenswrapper[4909]: E1201 10:43:13.315288 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2dcc3ccc9d7778a7c6fa74d7b692a89a00873b2026aa0daaa5ef7f4d1d939d8c\": container with ID starting with 2dcc3ccc9d7778a7c6fa74d7b692a89a00873b2026aa0daaa5ef7f4d1d939d8c not found: ID does not exist" containerID="2dcc3ccc9d7778a7c6fa74d7b692a89a00873b2026aa0daaa5ef7f4d1d939d8c" Dec 01 10:43:13 crc kubenswrapper[4909]: I1201 10:43:13.315347 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dcc3ccc9d7778a7c6fa74d7b692a89a00873b2026aa0daaa5ef7f4d1d939d8c"} err="failed to get container status \"2dcc3ccc9d7778a7c6fa74d7b692a89a00873b2026aa0daaa5ef7f4d1d939d8c\": rpc error: code = NotFound desc = could not find container \"2dcc3ccc9d7778a7c6fa74d7b692a89a00873b2026aa0daaa5ef7f4d1d939d8c\": container with ID starting with 2dcc3ccc9d7778a7c6fa74d7b692a89a00873b2026aa0daaa5ef7f4d1d939d8c not found: ID does not exist" Dec 01 10:43:13 crc kubenswrapper[4909]: I1201 10:43:13.320552 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-wfgm2"] Dec 01 10:43:14 crc kubenswrapper[4909]: I1201 10:43:14.287824 4909 generic.go:334] "Generic (PLEG): container finished" podID="a1beb94c-1e21-4dd7-814e-c345e45dc803" containerID="e8add48bf08f6f262584bf10760f30c9519aff3dc92e6d590f2ea29110e57ddf" exitCode=0 Dec 01 10:43:14 crc kubenswrapper[4909]: I1201 10:43:14.287930 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7" event={"ID":"a1beb94c-1e21-4dd7-814e-c345e45dc803","Type":"ContainerDied","Data":"e8add48bf08f6f262584bf10760f30c9519aff3dc92e6d590f2ea29110e57ddf"} Dec 01 10:43:15 crc kubenswrapper[4909]: I1201 10:43:15.265693 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a366b491-4c3c-40a9-86a0-a82d686b1a15" path="/var/lib/kubelet/pods/a366b491-4c3c-40a9-86a0-a82d686b1a15/volumes" Dec 01 10:43:15 crc kubenswrapper[4909]: I1201 10:43:15.300490 4909 generic.go:334] "Generic 
(PLEG): container finished" podID="a1beb94c-1e21-4dd7-814e-c345e45dc803" containerID="9615e546899408842c39b534066047a14c9af927a7adecb4d15a44e4e28efb96" exitCode=0 Dec 01 10:43:15 crc kubenswrapper[4909]: I1201 10:43:15.300556 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7" event={"ID":"a1beb94c-1e21-4dd7-814e-c345e45dc803","Type":"ContainerDied","Data":"9615e546899408842c39b534066047a14c9af927a7adecb4d15a44e4e28efb96"} Dec 01 10:43:16 crc kubenswrapper[4909]: I1201 10:43:16.547710 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7" Dec 01 10:43:16 crc kubenswrapper[4909]: I1201 10:43:16.622695 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1beb94c-1e21-4dd7-814e-c345e45dc803-bundle\") pod \"a1beb94c-1e21-4dd7-814e-c345e45dc803\" (UID: \"a1beb94c-1e21-4dd7-814e-c345e45dc803\") " Dec 01 10:43:16 crc kubenswrapper[4909]: I1201 10:43:16.624030 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1beb94c-1e21-4dd7-814e-c345e45dc803-bundle" (OuterVolumeSpecName: "bundle") pod "a1beb94c-1e21-4dd7-814e-c345e45dc803" (UID: "a1beb94c-1e21-4dd7-814e-c345e45dc803"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:43:16 crc kubenswrapper[4909]: I1201 10:43:16.624211 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqmdq\" (UniqueName: \"kubernetes.io/projected/a1beb94c-1e21-4dd7-814e-c345e45dc803-kube-api-access-mqmdq\") pod \"a1beb94c-1e21-4dd7-814e-c345e45dc803\" (UID: \"a1beb94c-1e21-4dd7-814e-c345e45dc803\") " Dec 01 10:43:16 crc kubenswrapper[4909]: I1201 10:43:16.624265 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1beb94c-1e21-4dd7-814e-c345e45dc803-util\") pod \"a1beb94c-1e21-4dd7-814e-c345e45dc803\" (UID: \"a1beb94c-1e21-4dd7-814e-c345e45dc803\") " Dec 01 10:43:16 crc kubenswrapper[4909]: I1201 10:43:16.629604 4909 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a1beb94c-1e21-4dd7-814e-c345e45dc803-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:43:16 crc kubenswrapper[4909]: I1201 10:43:16.630996 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1beb94c-1e21-4dd7-814e-c345e45dc803-kube-api-access-mqmdq" (OuterVolumeSpecName: "kube-api-access-mqmdq") pod "a1beb94c-1e21-4dd7-814e-c345e45dc803" (UID: "a1beb94c-1e21-4dd7-814e-c345e45dc803"). InnerVolumeSpecName "kube-api-access-mqmdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:43:16 crc kubenswrapper[4909]: I1201 10:43:16.643926 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1beb94c-1e21-4dd7-814e-c345e45dc803-util" (OuterVolumeSpecName: "util") pod "a1beb94c-1e21-4dd7-814e-c345e45dc803" (UID: "a1beb94c-1e21-4dd7-814e-c345e45dc803"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:43:16 crc kubenswrapper[4909]: I1201 10:43:16.730771 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqmdq\" (UniqueName: \"kubernetes.io/projected/a1beb94c-1e21-4dd7-814e-c345e45dc803-kube-api-access-mqmdq\") on node \"crc\" DevicePath \"\"" Dec 01 10:43:16 crc kubenswrapper[4909]: I1201 10:43:16.730841 4909 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a1beb94c-1e21-4dd7-814e-c345e45dc803-util\") on node \"crc\" DevicePath \"\"" Dec 01 10:43:17 crc kubenswrapper[4909]: I1201 10:43:17.328592 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7" event={"ID":"a1beb94c-1e21-4dd7-814e-c345e45dc803","Type":"ContainerDied","Data":"cdf7bf0eec6b9282174e3972c489cf9ef0a3bf0c279160d7cfc397461792c53a"} Dec 01 10:43:17 crc kubenswrapper[4909]: I1201 10:43:17.328668 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdf7bf0eec6b9282174e3972c489cf9ef0a3bf0c279160d7cfc397461792c53a" Dec 01 10:43:17 crc kubenswrapper[4909]: I1201 10:43:17.328711 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.581648 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-dd49f7fb7-z5zjd"] Dec 01 10:43:26 crc kubenswrapper[4909]: E1201 10:43:26.582442 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1beb94c-1e21-4dd7-814e-c345e45dc803" containerName="util" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.582456 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1beb94c-1e21-4dd7-814e-c345e45dc803" containerName="util" Dec 01 10:43:26 crc kubenswrapper[4909]: E1201 10:43:26.582467 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1beb94c-1e21-4dd7-814e-c345e45dc803" containerName="pull" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.582474 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1beb94c-1e21-4dd7-814e-c345e45dc803" containerName="pull" Dec 01 10:43:26 crc kubenswrapper[4909]: E1201 10:43:26.582484 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1beb94c-1e21-4dd7-814e-c345e45dc803" containerName="extract" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.582492 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1beb94c-1e21-4dd7-814e-c345e45dc803" containerName="extract" Dec 01 10:43:26 crc kubenswrapper[4909]: E1201 10:43:26.582501 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a366b491-4c3c-40a9-86a0-a82d686b1a15" containerName="console" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.582507 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a366b491-4c3c-40a9-86a0-a82d686b1a15" containerName="console" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.582610 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a366b491-4c3c-40a9-86a0-a82d686b1a15" 
containerName="console" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.582627 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1beb94c-1e21-4dd7-814e-c345e45dc803" containerName="extract" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.583097 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-dd49f7fb7-z5zjd" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.585657 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.585866 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.586060 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-4bstd" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.586074 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.589241 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.598966 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-dd49f7fb7-z5zjd"] Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.664422 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2dd20de1-a28e-4db1-9776-e206ca717a54-webhook-cert\") pod \"metallb-operator-controller-manager-dd49f7fb7-z5zjd\" (UID: \"2dd20de1-a28e-4db1-9776-e206ca717a54\") " pod="metallb-system/metallb-operator-controller-manager-dd49f7fb7-z5zjd" Dec 01 
10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.664516 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9mwl\" (UniqueName: \"kubernetes.io/projected/2dd20de1-a28e-4db1-9776-e206ca717a54-kube-api-access-g9mwl\") pod \"metallb-operator-controller-manager-dd49f7fb7-z5zjd\" (UID: \"2dd20de1-a28e-4db1-9776-e206ca717a54\") " pod="metallb-system/metallb-operator-controller-manager-dd49f7fb7-z5zjd" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.664743 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2dd20de1-a28e-4db1-9776-e206ca717a54-apiservice-cert\") pod \"metallb-operator-controller-manager-dd49f7fb7-z5zjd\" (UID: \"2dd20de1-a28e-4db1-9776-e206ca717a54\") " pod="metallb-system/metallb-operator-controller-manager-dd49f7fb7-z5zjd" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.766324 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2dd20de1-a28e-4db1-9776-e206ca717a54-apiservice-cert\") pod \"metallb-operator-controller-manager-dd49f7fb7-z5zjd\" (UID: \"2dd20de1-a28e-4db1-9776-e206ca717a54\") " pod="metallb-system/metallb-operator-controller-manager-dd49f7fb7-z5zjd" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.766437 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2dd20de1-a28e-4db1-9776-e206ca717a54-webhook-cert\") pod \"metallb-operator-controller-manager-dd49f7fb7-z5zjd\" (UID: \"2dd20de1-a28e-4db1-9776-e206ca717a54\") " pod="metallb-system/metallb-operator-controller-manager-dd49f7fb7-z5zjd" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.766499 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9mwl\" (UniqueName: 
\"kubernetes.io/projected/2dd20de1-a28e-4db1-9776-e206ca717a54-kube-api-access-g9mwl\") pod \"metallb-operator-controller-manager-dd49f7fb7-z5zjd\" (UID: \"2dd20de1-a28e-4db1-9776-e206ca717a54\") " pod="metallb-system/metallb-operator-controller-manager-dd49f7fb7-z5zjd" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.772807 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2dd20de1-a28e-4db1-9776-e206ca717a54-webhook-cert\") pod \"metallb-operator-controller-manager-dd49f7fb7-z5zjd\" (UID: \"2dd20de1-a28e-4db1-9776-e206ca717a54\") " pod="metallb-system/metallb-operator-controller-manager-dd49f7fb7-z5zjd" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.773384 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2dd20de1-a28e-4db1-9776-e206ca717a54-apiservice-cert\") pod \"metallb-operator-controller-manager-dd49f7fb7-z5zjd\" (UID: \"2dd20de1-a28e-4db1-9776-e206ca717a54\") " pod="metallb-system/metallb-operator-controller-manager-dd49f7fb7-z5zjd" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.783391 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9mwl\" (UniqueName: \"kubernetes.io/projected/2dd20de1-a28e-4db1-9776-e206ca717a54-kube-api-access-g9mwl\") pod \"metallb-operator-controller-manager-dd49f7fb7-z5zjd\" (UID: \"2dd20de1-a28e-4db1-9776-e206ca717a54\") " pod="metallb-system/metallb-operator-controller-manager-dd49f7fb7-z5zjd" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.833976 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-9cd4d6db-x89s8"] Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.834801 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-9cd4d6db-x89s8" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.837379 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.837996 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.838578 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-2r2x7" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.868180 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/40db0db9-27bb-4dfe-aff9-a9a8dcd67142-apiservice-cert\") pod \"metallb-operator-webhook-server-9cd4d6db-x89s8\" (UID: \"40db0db9-27bb-4dfe-aff9-a9a8dcd67142\") " pod="metallb-system/metallb-operator-webhook-server-9cd4d6db-x89s8" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.868253 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njtc9\" (UniqueName: \"kubernetes.io/projected/40db0db9-27bb-4dfe-aff9-a9a8dcd67142-kube-api-access-njtc9\") pod \"metallb-operator-webhook-server-9cd4d6db-x89s8\" (UID: \"40db0db9-27bb-4dfe-aff9-a9a8dcd67142\") " pod="metallb-system/metallb-operator-webhook-server-9cd4d6db-x89s8" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.868296 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/40db0db9-27bb-4dfe-aff9-a9a8dcd67142-webhook-cert\") pod \"metallb-operator-webhook-server-9cd4d6db-x89s8\" (UID: \"40db0db9-27bb-4dfe-aff9-a9a8dcd67142\") " pod="metallb-system/metallb-operator-webhook-server-9cd4d6db-x89s8" Dec 01 
10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.900034 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-dd49f7fb7-z5zjd" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.908771 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-9cd4d6db-x89s8"] Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.969539 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njtc9\" (UniqueName: \"kubernetes.io/projected/40db0db9-27bb-4dfe-aff9-a9a8dcd67142-kube-api-access-njtc9\") pod \"metallb-operator-webhook-server-9cd4d6db-x89s8\" (UID: \"40db0db9-27bb-4dfe-aff9-a9a8dcd67142\") " pod="metallb-system/metallb-operator-webhook-server-9cd4d6db-x89s8" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.969594 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/40db0db9-27bb-4dfe-aff9-a9a8dcd67142-webhook-cert\") pod \"metallb-operator-webhook-server-9cd4d6db-x89s8\" (UID: \"40db0db9-27bb-4dfe-aff9-a9a8dcd67142\") " pod="metallb-system/metallb-operator-webhook-server-9cd4d6db-x89s8" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.969666 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/40db0db9-27bb-4dfe-aff9-a9a8dcd67142-apiservice-cert\") pod \"metallb-operator-webhook-server-9cd4d6db-x89s8\" (UID: \"40db0db9-27bb-4dfe-aff9-a9a8dcd67142\") " pod="metallb-system/metallb-operator-webhook-server-9cd4d6db-x89s8" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.978833 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/40db0db9-27bb-4dfe-aff9-a9a8dcd67142-webhook-cert\") pod \"metallb-operator-webhook-server-9cd4d6db-x89s8\" (UID: 
\"40db0db9-27bb-4dfe-aff9-a9a8dcd67142\") " pod="metallb-system/metallb-operator-webhook-server-9cd4d6db-x89s8" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.990433 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/40db0db9-27bb-4dfe-aff9-a9a8dcd67142-apiservice-cert\") pod \"metallb-operator-webhook-server-9cd4d6db-x89s8\" (UID: \"40db0db9-27bb-4dfe-aff9-a9a8dcd67142\") " pod="metallb-system/metallb-operator-webhook-server-9cd4d6db-x89s8" Dec 01 10:43:26 crc kubenswrapper[4909]: I1201 10:43:26.998161 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njtc9\" (UniqueName: \"kubernetes.io/projected/40db0db9-27bb-4dfe-aff9-a9a8dcd67142-kube-api-access-njtc9\") pod \"metallb-operator-webhook-server-9cd4d6db-x89s8\" (UID: \"40db0db9-27bb-4dfe-aff9-a9a8dcd67142\") " pod="metallb-system/metallb-operator-webhook-server-9cd4d6db-x89s8" Dec 01 10:43:27 crc kubenswrapper[4909]: I1201 10:43:27.151486 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-9cd4d6db-x89s8" Dec 01 10:43:27 crc kubenswrapper[4909]: I1201 10:43:27.168404 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-dd49f7fb7-z5zjd"] Dec 01 10:43:27 crc kubenswrapper[4909]: W1201 10:43:27.176167 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dd20de1_a28e_4db1_9776_e206ca717a54.slice/crio-b91ef8d4ea4fff867a74bb8169e16bbed672735ede8fdd3af7110d5deb164175 WatchSource:0}: Error finding container b91ef8d4ea4fff867a74bb8169e16bbed672735ede8fdd3af7110d5deb164175: Status 404 returned error can't find the container with id b91ef8d4ea4fff867a74bb8169e16bbed672735ede8fdd3af7110d5deb164175 Dec 01 10:43:27 crc kubenswrapper[4909]: I1201 10:43:27.404904 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-dd49f7fb7-z5zjd" event={"ID":"2dd20de1-a28e-4db1-9776-e206ca717a54","Type":"ContainerStarted","Data":"b91ef8d4ea4fff867a74bb8169e16bbed672735ede8fdd3af7110d5deb164175"} Dec 01 10:43:27 crc kubenswrapper[4909]: I1201 10:43:27.475625 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-9cd4d6db-x89s8"] Dec 01 10:43:27 crc kubenswrapper[4909]: W1201 10:43:27.484389 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40db0db9_27bb_4dfe_aff9_a9a8dcd67142.slice/crio-0e14c5d83d505b1c98db49fcaef2ead05bcb2976f5bc89f7796af38718e2e33f WatchSource:0}: Error finding container 0e14c5d83d505b1c98db49fcaef2ead05bcb2976f5bc89f7796af38718e2e33f: Status 404 returned error can't find the container with id 0e14c5d83d505b1c98db49fcaef2ead05bcb2976f5bc89f7796af38718e2e33f Dec 01 10:43:28 crc kubenswrapper[4909]: I1201 10:43:28.412056 4909 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/metallb-operator-webhook-server-9cd4d6db-x89s8" event={"ID":"40db0db9-27bb-4dfe-aff9-a9a8dcd67142","Type":"ContainerStarted","Data":"0e14c5d83d505b1c98db49fcaef2ead05bcb2976f5bc89f7796af38718e2e33f"} Dec 01 10:43:30 crc kubenswrapper[4909]: I1201 10:43:30.428763 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-dd49f7fb7-z5zjd" event={"ID":"2dd20de1-a28e-4db1-9776-e206ca717a54","Type":"ContainerStarted","Data":"c78431cfd280fdd64b5af25aeb3d36d2a7a62236b40d7f63ae26f34fb72e3bea"} Dec 01 10:43:30 crc kubenswrapper[4909]: I1201 10:43:30.456612 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-dd49f7fb7-z5zjd" podStartSLOduration=1.400764071 podStartE2EDuration="4.456589725s" podCreationTimestamp="2025-12-01 10:43:26 +0000 UTC" firstStartedPulling="2025-12-01 10:43:27.180406591 +0000 UTC m=+724.414877489" lastFinishedPulling="2025-12-01 10:43:30.236232245 +0000 UTC m=+727.470703143" observedRunningTime="2025-12-01 10:43:30.451390485 +0000 UTC m=+727.685861403" watchObservedRunningTime="2025-12-01 10:43:30.456589725 +0000 UTC m=+727.691060623" Dec 01 10:43:31 crc kubenswrapper[4909]: I1201 10:43:31.440463 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-dd49f7fb7-z5zjd" Dec 01 10:43:32 crc kubenswrapper[4909]: I1201 10:43:32.448925 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-9cd4d6db-x89s8" event={"ID":"40db0db9-27bb-4dfe-aff9-a9a8dcd67142","Type":"ContainerStarted","Data":"f2d6fffa48c63afd9117377f9a486440050f4a819d3ac311e4c4042179140f4a"} Dec 01 10:43:32 crc kubenswrapper[4909]: I1201 10:43:32.475381 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-9cd4d6db-x89s8" 
podStartSLOduration=1.767074783 podStartE2EDuration="6.475364217s" podCreationTimestamp="2025-12-01 10:43:26 +0000 UTC" firstStartedPulling="2025-12-01 10:43:27.487638654 +0000 UTC m=+724.722109552" lastFinishedPulling="2025-12-01 10:43:32.195928088 +0000 UTC m=+729.430398986" observedRunningTime="2025-12-01 10:43:32.467949798 +0000 UTC m=+729.702420706" watchObservedRunningTime="2025-12-01 10:43:32.475364217 +0000 UTC m=+729.709835115" Dec 01 10:43:33 crc kubenswrapper[4909]: I1201 10:43:33.454741 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-9cd4d6db-x89s8" Dec 01 10:43:36 crc kubenswrapper[4909]: I1201 10:43:36.193635 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:43:36 crc kubenswrapper[4909]: I1201 10:43:36.193729 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:43:47 crc kubenswrapper[4909]: I1201 10:43:47.161339 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-9cd4d6db-x89s8" Dec 01 10:43:54 crc kubenswrapper[4909]: I1201 10:43:54.426900 4909 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 10:44:06 crc kubenswrapper[4909]: I1201 10:44:06.194458 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:44:06 crc kubenswrapper[4909]: I1201 10:44:06.195495 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:44:06 crc kubenswrapper[4909]: I1201 10:44:06.195588 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:44:06 crc kubenswrapper[4909]: I1201 10:44:06.196588 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1261d57afc6b7af0172cc6d97bb6e0cf382f59bc9c526de8c48bb45bac9b39b3"} pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:44:06 crc kubenswrapper[4909]: I1201 10:44:06.196672 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" containerID="cri-o://1261d57afc6b7af0172cc6d97bb6e0cf382f59bc9c526de8c48bb45bac9b39b3" gracePeriod=600 Dec 01 10:44:06 crc kubenswrapper[4909]: I1201 10:44:06.664858 4909 generic.go:334] "Generic (PLEG): container finished" podID="672850e4-d044-44cc-b8a2-517dc1a285be" containerID="1261d57afc6b7af0172cc6d97bb6e0cf382f59bc9c526de8c48bb45bac9b39b3" exitCode=0 Dec 01 10:44:06 crc kubenswrapper[4909]: I1201 10:44:06.664909 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerDied","Data":"1261d57afc6b7af0172cc6d97bb6e0cf382f59bc9c526de8c48bb45bac9b39b3"} Dec 01 10:44:06 crc kubenswrapper[4909]: I1201 10:44:06.665214 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"421cf8f5c478fd334e97e45775a9bedfa323e6a4c50a049b81ebf8da31dc53c8"} Dec 01 10:44:06 crc kubenswrapper[4909]: I1201 10:44:06.665238 4909 scope.go:117] "RemoveContainer" containerID="cebd226375ed9ea525958531f3c656022fcef61d7c35f6db43a2b23cac24085f" Dec 01 10:44:06 crc kubenswrapper[4909]: I1201 10:44:06.903166 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-dd49f7fb7-z5zjd" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.699952 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-22tnq"] Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.701307 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-22tnq" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.709445 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-6fmj2" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.710413 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.705863 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-xdgcz"] Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.718420 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.723202 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.723594 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.728215 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-22tnq"] Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.783329 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-frr-sockets\") pod \"frr-k8s-xdgcz\" (UID: \"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.783383 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-reloader\") pod \"frr-k8s-xdgcz\" (UID: \"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.783409 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwwlj\" (UniqueName: \"kubernetes.io/projected/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-kube-api-access-dwwlj\") pod \"frr-k8s-xdgcz\" (UID: \"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.783433 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-metrics-certs\") pod 
\"frr-k8s-xdgcz\" (UID: \"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.783651 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5krfw\" (UniqueName: \"kubernetes.io/projected/4e84529d-1222-4db1-9488-1ed872b096af-kube-api-access-5krfw\") pod \"frr-k8s-webhook-server-7fcb986d4-22tnq\" (UID: \"4e84529d-1222-4db1-9488-1ed872b096af\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-22tnq" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.783713 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-frr-conf\") pod \"frr-k8s-xdgcz\" (UID: \"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.783766 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-frr-startup\") pod \"frr-k8s-xdgcz\" (UID: \"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.783860 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e84529d-1222-4db1-9488-1ed872b096af-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-22tnq\" (UID: \"4e84529d-1222-4db1-9488-1ed872b096af\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-22tnq" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.784010 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-metrics\") pod \"frr-k8s-xdgcz\" (UID: 
\"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.814298 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-lz22m"] Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.815615 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-lz22m" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.818350 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.818964 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-x9f7z" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.819199 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.819333 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.827636 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-zvqbb"] Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.832933 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-zvqbb" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.838135 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.854257 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-zvqbb"] Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.886524 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/359939eb-1948-4c14-b573-16e8342ce29a-memberlist\") pod \"speaker-lz22m\" (UID: \"359939eb-1948-4c14-b573-16e8342ce29a\") " pod="metallb-system/speaker-lz22m" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.886627 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5krfw\" (UniqueName: \"kubernetes.io/projected/4e84529d-1222-4db1-9488-1ed872b096af-kube-api-access-5krfw\") pod \"frr-k8s-webhook-server-7fcb986d4-22tnq\" (UID: \"4e84529d-1222-4db1-9488-1ed872b096af\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-22tnq" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.886651 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-frr-conf\") pod \"frr-k8s-xdgcz\" (UID: \"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.886670 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/359939eb-1948-4c14-b573-16e8342ce29a-metrics-certs\") pod \"speaker-lz22m\" (UID: \"359939eb-1948-4c14-b573-16e8342ce29a\") " pod="metallb-system/speaker-lz22m" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 
10:44:07.886693 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-frr-startup\") pod \"frr-k8s-xdgcz\" (UID: \"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.886722 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e84529d-1222-4db1-9488-1ed872b096af-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-22tnq\" (UID: \"4e84529d-1222-4db1-9488-1ed872b096af\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-22tnq" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.886752 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbthq\" (UniqueName: \"kubernetes.io/projected/359939eb-1948-4c14-b573-16e8342ce29a-kube-api-access-bbthq\") pod \"speaker-lz22m\" (UID: \"359939eb-1948-4c14-b573-16e8342ce29a\") " pod="metallb-system/speaker-lz22m" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.886770 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-metrics\") pod \"frr-k8s-xdgcz\" (UID: \"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.886794 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/359939eb-1948-4c14-b573-16e8342ce29a-metallb-excludel2\") pod \"speaker-lz22m\" (UID: \"359939eb-1948-4c14-b573-16e8342ce29a\") " pod="metallb-system/speaker-lz22m" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.886823 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" 
(UniqueName: \"kubernetes.io/empty-dir/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-frr-sockets\") pod \"frr-k8s-xdgcz\" (UID: \"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.886861 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-reloader\") pod \"frr-k8s-xdgcz\" (UID: \"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.886896 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwwlj\" (UniqueName: \"kubernetes.io/projected/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-kube-api-access-dwwlj\") pod \"frr-k8s-xdgcz\" (UID: \"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.886919 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-metrics-certs\") pod \"frr-k8s-xdgcz\" (UID: \"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:07 crc kubenswrapper[4909]: E1201 10:44:07.887041 4909 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 01 10:44:07 crc kubenswrapper[4909]: E1201 10:44:07.887099 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-metrics-certs podName:5419a1e9-7227-4f9e-89b2-ecb8469c1cd6 nodeName:}" failed. No retries permitted until 2025-12-01 10:44:08.387082146 +0000 UTC m=+765.621553044 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-metrics-certs") pod "frr-k8s-xdgcz" (UID: "5419a1e9-7227-4f9e-89b2-ecb8469c1cd6") : secret "frr-k8s-certs-secret" not found Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.888072 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-frr-conf\") pod \"frr-k8s-xdgcz\" (UID: \"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.888753 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-frr-startup\") pod \"frr-k8s-xdgcz\" (UID: \"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.895804 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e84529d-1222-4db1-9488-1ed872b096af-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-22tnq\" (UID: \"4e84529d-1222-4db1-9488-1ed872b096af\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-22tnq" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.896052 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-metrics\") pod \"frr-k8s-xdgcz\" (UID: \"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.896287 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-frr-sockets\") pod \"frr-k8s-xdgcz\" (UID: \"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " 
pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.896501 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-reloader\") pod \"frr-k8s-xdgcz\" (UID: \"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.931710 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5krfw\" (UniqueName: \"kubernetes.io/projected/4e84529d-1222-4db1-9488-1ed872b096af-kube-api-access-5krfw\") pod \"frr-k8s-webhook-server-7fcb986d4-22tnq\" (UID: \"4e84529d-1222-4db1-9488-1ed872b096af\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-22tnq" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.934320 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwwlj\" (UniqueName: \"kubernetes.io/projected/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-kube-api-access-dwwlj\") pod \"frr-k8s-xdgcz\" (UID: \"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.987813 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbthq\" (UniqueName: \"kubernetes.io/projected/359939eb-1948-4c14-b573-16e8342ce29a-kube-api-access-bbthq\") pod \"speaker-lz22m\" (UID: \"359939eb-1948-4c14-b573-16e8342ce29a\") " pod="metallb-system/speaker-lz22m" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.987911 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/359939eb-1948-4c14-b573-16e8342ce29a-metallb-excludel2\") pod \"speaker-lz22m\" (UID: \"359939eb-1948-4c14-b573-16e8342ce29a\") " pod="metallb-system/speaker-lz22m" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.987943 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3fe137b-0797-46b2-b512-cf8fa383f42d-cert\") pod \"controller-f8648f98b-zvqbb\" (UID: \"c3fe137b-0797-46b2-b512-cf8fa383f42d\") " pod="metallb-system/controller-f8648f98b-zvqbb" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.987998 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3fe137b-0797-46b2-b512-cf8fa383f42d-metrics-certs\") pod \"controller-f8648f98b-zvqbb\" (UID: \"c3fe137b-0797-46b2-b512-cf8fa383f42d\") " pod="metallb-system/controller-f8648f98b-zvqbb" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.988060 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/359939eb-1948-4c14-b573-16e8342ce29a-memberlist\") pod \"speaker-lz22m\" (UID: \"359939eb-1948-4c14-b573-16e8342ce29a\") " pod="metallb-system/speaker-lz22m" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.988087 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/359939eb-1948-4c14-b573-16e8342ce29a-metrics-certs\") pod \"speaker-lz22m\" (UID: \"359939eb-1948-4c14-b573-16e8342ce29a\") " pod="metallb-system/speaker-lz22m" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.988107 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8krvf\" (UniqueName: \"kubernetes.io/projected/c3fe137b-0797-46b2-b512-cf8fa383f42d-kube-api-access-8krvf\") pod \"controller-f8648f98b-zvqbb\" (UID: \"c3fe137b-0797-46b2-b512-cf8fa383f42d\") " pod="metallb-system/controller-f8648f98b-zvqbb" Dec 01 10:44:07 crc kubenswrapper[4909]: E1201 10:44:07.988472 4909 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: 
secret "metallb-memberlist" not found Dec 01 10:44:07 crc kubenswrapper[4909]: E1201 10:44:07.988526 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/359939eb-1948-4c14-b573-16e8342ce29a-memberlist podName:359939eb-1948-4c14-b573-16e8342ce29a nodeName:}" failed. No retries permitted until 2025-12-01 10:44:08.488508708 +0000 UTC m=+765.722979606 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/359939eb-1948-4c14-b573-16e8342ce29a-memberlist") pod "speaker-lz22m" (UID: "359939eb-1948-4c14-b573-16e8342ce29a") : secret "metallb-memberlist" not found Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.988897 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/359939eb-1948-4c14-b573-16e8342ce29a-metallb-excludel2\") pod \"speaker-lz22m\" (UID: \"359939eb-1948-4c14-b573-16e8342ce29a\") " pod="metallb-system/speaker-lz22m" Dec 01 10:44:07 crc kubenswrapper[4909]: I1201 10:44:07.991797 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/359939eb-1948-4c14-b573-16e8342ce29a-metrics-certs\") pod \"speaker-lz22m\" (UID: \"359939eb-1948-4c14-b573-16e8342ce29a\") " pod="metallb-system/speaker-lz22m" Dec 01 10:44:08 crc kubenswrapper[4909]: I1201 10:44:08.003754 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbthq\" (UniqueName: \"kubernetes.io/projected/359939eb-1948-4c14-b573-16e8342ce29a-kube-api-access-bbthq\") pod \"speaker-lz22m\" (UID: \"359939eb-1948-4c14-b573-16e8342ce29a\") " pod="metallb-system/speaker-lz22m" Dec 01 10:44:08 crc kubenswrapper[4909]: I1201 10:44:08.028244 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-22tnq" Dec 01 10:44:08 crc kubenswrapper[4909]: I1201 10:44:08.089312 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3fe137b-0797-46b2-b512-cf8fa383f42d-cert\") pod \"controller-f8648f98b-zvqbb\" (UID: \"c3fe137b-0797-46b2-b512-cf8fa383f42d\") " pod="metallb-system/controller-f8648f98b-zvqbb" Dec 01 10:44:08 crc kubenswrapper[4909]: I1201 10:44:08.089374 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3fe137b-0797-46b2-b512-cf8fa383f42d-metrics-certs\") pod \"controller-f8648f98b-zvqbb\" (UID: \"c3fe137b-0797-46b2-b512-cf8fa383f42d\") " pod="metallb-system/controller-f8648f98b-zvqbb" Dec 01 10:44:08 crc kubenswrapper[4909]: I1201 10:44:08.089464 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8krvf\" (UniqueName: \"kubernetes.io/projected/c3fe137b-0797-46b2-b512-cf8fa383f42d-kube-api-access-8krvf\") pod \"controller-f8648f98b-zvqbb\" (UID: \"c3fe137b-0797-46b2-b512-cf8fa383f42d\") " pod="metallb-system/controller-f8648f98b-zvqbb" Dec 01 10:44:08 crc kubenswrapper[4909]: I1201 10:44:08.095566 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3fe137b-0797-46b2-b512-cf8fa383f42d-cert\") pod \"controller-f8648f98b-zvqbb\" (UID: \"c3fe137b-0797-46b2-b512-cf8fa383f42d\") " pod="metallb-system/controller-f8648f98b-zvqbb" Dec 01 10:44:08 crc kubenswrapper[4909]: I1201 10:44:08.096511 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3fe137b-0797-46b2-b512-cf8fa383f42d-metrics-certs\") pod \"controller-f8648f98b-zvqbb\" (UID: \"c3fe137b-0797-46b2-b512-cf8fa383f42d\") " pod="metallb-system/controller-f8648f98b-zvqbb" Dec 01 10:44:08 crc 
kubenswrapper[4909]: I1201 10:44:08.107946 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8krvf\" (UniqueName: \"kubernetes.io/projected/c3fe137b-0797-46b2-b512-cf8fa383f42d-kube-api-access-8krvf\") pod \"controller-f8648f98b-zvqbb\" (UID: \"c3fe137b-0797-46b2-b512-cf8fa383f42d\") " pod="metallb-system/controller-f8648f98b-zvqbb" Dec 01 10:44:08 crc kubenswrapper[4909]: I1201 10:44:08.149316 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-zvqbb" Dec 01 10:44:08 crc kubenswrapper[4909]: I1201 10:44:08.366460 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-zvqbb"] Dec 01 10:44:08 crc kubenswrapper[4909]: W1201 10:44:08.369143 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3fe137b_0797_46b2_b512_cf8fa383f42d.slice/crio-46c28dce72d0de4d5550be703eebb13b72ff45e8f4f4bb2d044cc3e81be3ae1b WatchSource:0}: Error finding container 46c28dce72d0de4d5550be703eebb13b72ff45e8f4f4bb2d044cc3e81be3ae1b: Status 404 returned error can't find the container with id 46c28dce72d0de4d5550be703eebb13b72ff45e8f4f4bb2d044cc3e81be3ae1b Dec 01 10:44:08 crc kubenswrapper[4909]: I1201 10:44:08.394268 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-metrics-certs\") pod \"frr-k8s-xdgcz\" (UID: \"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:08 crc kubenswrapper[4909]: I1201 10:44:08.398179 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5419a1e9-7227-4f9e-89b2-ecb8469c1cd6-metrics-certs\") pod \"frr-k8s-xdgcz\" (UID: \"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6\") " pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:08 crc 
kubenswrapper[4909]: I1201 10:44:08.470762 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-22tnq"] Dec 01 10:44:08 crc kubenswrapper[4909]: I1201 10:44:08.495540 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/359939eb-1948-4c14-b573-16e8342ce29a-memberlist\") pod \"speaker-lz22m\" (UID: \"359939eb-1948-4c14-b573-16e8342ce29a\") " pod="metallb-system/speaker-lz22m" Dec 01 10:44:08 crc kubenswrapper[4909]: E1201 10:44:08.495755 4909 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 01 10:44:08 crc kubenswrapper[4909]: E1201 10:44:08.495856 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/359939eb-1948-4c14-b573-16e8342ce29a-memberlist podName:359939eb-1948-4c14-b573-16e8342ce29a nodeName:}" failed. No retries permitted until 2025-12-01 10:44:09.495836689 +0000 UTC m=+766.730307587 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/359939eb-1948-4c14-b573-16e8342ce29a-memberlist") pod "speaker-lz22m" (UID: "359939eb-1948-4c14-b573-16e8342ce29a") : secret "metallb-memberlist" not found Dec 01 10:44:08 crc kubenswrapper[4909]: I1201 10:44:08.637611 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-xdgcz" Dec 01 10:44:08 crc kubenswrapper[4909]: I1201 10:44:08.682949 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-zvqbb" event={"ID":"c3fe137b-0797-46b2-b512-cf8fa383f42d","Type":"ContainerStarted","Data":"a41948c24be31a6e60ecd6294f55f217ec57cbc6a718e4e0ff911e9cebddda15"} Dec 01 10:44:08 crc kubenswrapper[4909]: I1201 10:44:08.682997 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-zvqbb" event={"ID":"c3fe137b-0797-46b2-b512-cf8fa383f42d","Type":"ContainerStarted","Data":"197ae9f9c28d15216aa352b95ec052971530fd9c32f82cb9c34a399241d2b7e6"} Dec 01 10:44:08 crc kubenswrapper[4909]: I1201 10:44:08.683023 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-zvqbb" event={"ID":"c3fe137b-0797-46b2-b512-cf8fa383f42d","Type":"ContainerStarted","Data":"46c28dce72d0de4d5550be703eebb13b72ff45e8f4f4bb2d044cc3e81be3ae1b"} Dec 01 10:44:08 crc kubenswrapper[4909]: I1201 10:44:08.683126 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-zvqbb" Dec 01 10:44:08 crc kubenswrapper[4909]: I1201 10:44:08.683838 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-22tnq" event={"ID":"4e84529d-1222-4db1-9488-1ed872b096af","Type":"ContainerStarted","Data":"1bf624d7f085057d9fa21afd2c4bb09562e58b7dc50206b4f62e5d342aad9d9a"} Dec 01 10:44:08 crc kubenswrapper[4909]: I1201 10:44:08.703622 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-zvqbb" podStartSLOduration=1.7036041119999998 podStartE2EDuration="1.703604112s" podCreationTimestamp="2025-12-01 10:44:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:44:08.70093556 +0000 UTC 
m=+765.935406478" watchObservedRunningTime="2025-12-01 10:44:08.703604112 +0000 UTC m=+765.938075010" Dec 01 10:44:09 crc kubenswrapper[4909]: I1201 10:44:09.509411 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/359939eb-1948-4c14-b573-16e8342ce29a-memberlist\") pod \"speaker-lz22m\" (UID: \"359939eb-1948-4c14-b573-16e8342ce29a\") " pod="metallb-system/speaker-lz22m" Dec 01 10:44:09 crc kubenswrapper[4909]: I1201 10:44:09.522452 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/359939eb-1948-4c14-b573-16e8342ce29a-memberlist\") pod \"speaker-lz22m\" (UID: \"359939eb-1948-4c14-b573-16e8342ce29a\") " pod="metallb-system/speaker-lz22m" Dec 01 10:44:09 crc kubenswrapper[4909]: I1201 10:44:09.636578 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-lz22m" Dec 01 10:44:09 crc kubenswrapper[4909]: W1201 10:44:09.658608 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod359939eb_1948_4c14_b573_16e8342ce29a.slice/crio-d6965ff01de2050347b5848b7472f46afed921059b53bab5ef52e19baca8c0a6 WatchSource:0}: Error finding container d6965ff01de2050347b5848b7472f46afed921059b53bab5ef52e19baca8c0a6: Status 404 returned error can't find the container with id d6965ff01de2050347b5848b7472f46afed921059b53bab5ef52e19baca8c0a6 Dec 01 10:44:09 crc kubenswrapper[4909]: I1201 10:44:09.705386 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lz22m" event={"ID":"359939eb-1948-4c14-b573-16e8342ce29a","Type":"ContainerStarted","Data":"d6965ff01de2050347b5848b7472f46afed921059b53bab5ef52e19baca8c0a6"} Dec 01 10:44:09 crc kubenswrapper[4909]: I1201 10:44:09.706924 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xdgcz" 
event={"ID":"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6","Type":"ContainerStarted","Data":"9d5af23afb40fb954188e8a02b8974133e985a1cd406586e110760c3fd019539"} Dec 01 10:44:10 crc kubenswrapper[4909]: I1201 10:44:10.722208 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lz22m" event={"ID":"359939eb-1948-4c14-b573-16e8342ce29a","Type":"ContainerStarted","Data":"8b925747156e7b46609877a84416069c96568c9f99ab08c3f371330ee004dc5c"} Dec 01 10:44:10 crc kubenswrapper[4909]: I1201 10:44:10.722620 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lz22m" event={"ID":"359939eb-1948-4c14-b573-16e8342ce29a","Type":"ContainerStarted","Data":"7ef6f4cf1396e6725c29852e159261856e5ed46a0ec280b43a8a88fc470c121a"} Dec 01 10:44:10 crc kubenswrapper[4909]: I1201 10:44:10.722659 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-lz22m" Dec 01 10:44:10 crc kubenswrapper[4909]: I1201 10:44:10.744204 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-lz22m" podStartSLOduration=3.744186315 podStartE2EDuration="3.744186315s" podCreationTimestamp="2025-12-01 10:44:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:44:10.743642138 +0000 UTC m=+767.978113046" watchObservedRunningTime="2025-12-01 10:44:10.744186315 +0000 UTC m=+767.978657213" Dec 01 10:44:16 crc kubenswrapper[4909]: I1201 10:44:16.774597 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-22tnq" event={"ID":"4e84529d-1222-4db1-9488-1ed872b096af","Type":"ContainerStarted","Data":"67a3538fb772ec3a4d88e1b6e5e3d38cf83452f47a0321c1d0db68dc7246f583"} Dec 01 10:44:16 crc kubenswrapper[4909]: I1201 10:44:16.775206 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-22tnq"
Dec 01 10:44:16 crc kubenswrapper[4909]: I1201 10:44:16.779102 4909 generic.go:334] "Generic (PLEG): container finished" podID="5419a1e9-7227-4f9e-89b2-ecb8469c1cd6" containerID="fe70a038a87b99df8ea7b2686bfce32f48b5d46f13be3391bc3f2a1ddc5fb9be" exitCode=0
Dec 01 10:44:16 crc kubenswrapper[4909]: I1201 10:44:16.779161 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xdgcz" event={"ID":"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6","Type":"ContainerDied","Data":"fe70a038a87b99df8ea7b2686bfce32f48b5d46f13be3391bc3f2a1ddc5fb9be"}
Dec 01 10:44:16 crc kubenswrapper[4909]: I1201 10:44:16.804949 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-22tnq" podStartSLOduration=2.482709897 podStartE2EDuration="9.804928925s" podCreationTimestamp="2025-12-01 10:44:07 +0000 UTC" firstStartedPulling="2025-12-01 10:44:08.4750741 +0000 UTC m=+765.709544988" lastFinishedPulling="2025-12-01 10:44:15.797293118 +0000 UTC m=+773.031764016" observedRunningTime="2025-12-01 10:44:16.801613754 +0000 UTC m=+774.036084662" watchObservedRunningTime="2025-12-01 10:44:16.804928925 +0000 UTC m=+774.039399823"
Dec 01 10:44:17 crc kubenswrapper[4909]: I1201 10:44:17.787571 4909 generic.go:334] "Generic (PLEG): container finished" podID="5419a1e9-7227-4f9e-89b2-ecb8469c1cd6" containerID="bca310bf873632650a48930cbb349750d9af53ed3ee53815f64deb45045b9534" exitCode=0
Dec 01 10:44:17 crc kubenswrapper[4909]: I1201 10:44:17.787678 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xdgcz" event={"ID":"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6","Type":"ContainerDied","Data":"bca310bf873632650a48930cbb349750d9af53ed3ee53815f64deb45045b9534"}
Dec 01 10:44:18 crc kubenswrapper[4909]: I1201 10:44:18.152958 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-zvqbb"
Dec 01 10:44:18 crc kubenswrapper[4909]: I1201 10:44:18.798403 4909 generic.go:334] "Generic (PLEG): container finished" podID="5419a1e9-7227-4f9e-89b2-ecb8469c1cd6" containerID="de87ad184f25a8b4963caedb905717c108ee92b5358b7407d05e2d8761037584" exitCode=0
Dec 01 10:44:18 crc kubenswrapper[4909]: I1201 10:44:18.798473 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xdgcz" event={"ID":"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6","Type":"ContainerDied","Data":"de87ad184f25a8b4963caedb905717c108ee92b5358b7407d05e2d8761037584"}
Dec 01 10:44:19 crc kubenswrapper[4909]: I1201 10:44:19.642284 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-lz22m"
Dec 01 10:44:19 crc kubenswrapper[4909]: I1201 10:44:19.811517 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xdgcz" event={"ID":"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6","Type":"ContainerStarted","Data":"4c277a3b84de32b75c5b51a0bce3a28b8d8b55f416246221eb94787ee9be9bae"}
Dec 01 10:44:19 crc kubenswrapper[4909]: I1201 10:44:19.811569 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xdgcz" event={"ID":"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6","Type":"ContainerStarted","Data":"4c613c745e7bb5dcdb33f59fe51be717b7680dafe8b739d15485bbba0076d5d1"}
Dec 01 10:44:19 crc kubenswrapper[4909]: I1201 10:44:19.811584 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xdgcz" event={"ID":"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6","Type":"ContainerStarted","Data":"086f10517cc287d2cde2f614285df29ddc5303a0cfe3cd9a8bd2621744b2d6cf"}
Dec 01 10:44:19 crc kubenswrapper[4909]: I1201 10:44:19.811597 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xdgcz" event={"ID":"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6","Type":"ContainerStarted","Data":"4d835a4ba7b1b331de2110250baeb69fdcb0cdab30e316dc494b2beb52b8cb8e"}
Dec 01 10:44:19 crc kubenswrapper[4909]: I1201 10:44:19.811611 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xdgcz" event={"ID":"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6","Type":"ContainerStarted","Data":"a3f3418501af92577561c2e3117d7bc031239bfbcd7cf05eb581f1f853e690ca"}
Dec 01 10:44:20 crc kubenswrapper[4909]: I1201 10:44:20.827446 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xdgcz" event={"ID":"5419a1e9-7227-4f9e-89b2-ecb8469c1cd6","Type":"ContainerStarted","Data":"ddfc0b02d5ade7dac998d7b9e9c91a351a966aa47a9feb116799a49dd53255c2"}
Dec 01 10:44:20 crc kubenswrapper[4909]: I1201 10:44:20.827904 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-xdgcz"
Dec 01 10:44:20 crc kubenswrapper[4909]: I1201 10:44:20.861622 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-xdgcz" podStartSLOduration=6.805766666 podStartE2EDuration="13.861604897s" podCreationTimestamp="2025-12-01 10:44:07 +0000 UTC" firstStartedPulling="2025-12-01 10:44:08.755549701 +0000 UTC m=+765.990020589" lastFinishedPulling="2025-12-01 10:44:15.811387912 +0000 UTC m=+773.045858820" observedRunningTime="2025-12-01 10:44:20.859678518 +0000 UTC m=+778.094149426" watchObservedRunningTime="2025-12-01 10:44:20.861604897 +0000 UTC m=+778.096075795"
Dec 01 10:44:22 crc kubenswrapper[4909]: I1201 10:44:22.724361 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wtkps"]
Dec 01 10:44:22 crc kubenswrapper[4909]: I1201 10:44:22.727069 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wtkps"
Dec 01 10:44:22 crc kubenswrapper[4909]: I1201 10:44:22.731477 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Dec 01 10:44:22 crc kubenswrapper[4909]: I1201 10:44:22.731952 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Dec 01 10:44:22 crc kubenswrapper[4909]: I1201 10:44:22.732193 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-gmrrh"
Dec 01 10:44:22 crc kubenswrapper[4909]: I1201 10:44:22.734763 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wtkps"]
Dec 01 10:44:22 crc kubenswrapper[4909]: I1201 10:44:22.848140 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzmdd\" (UniqueName: \"kubernetes.io/projected/f05091b2-5737-4495-8154-7ec82505eb67-kube-api-access-bzmdd\") pod \"openstack-operator-index-wtkps\" (UID: \"f05091b2-5737-4495-8154-7ec82505eb67\") " pod="openstack-operators/openstack-operator-index-wtkps"
Dec 01 10:44:22 crc kubenswrapper[4909]: I1201 10:44:22.949722 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzmdd\" (UniqueName: \"kubernetes.io/projected/f05091b2-5737-4495-8154-7ec82505eb67-kube-api-access-bzmdd\") pod \"openstack-operator-index-wtkps\" (UID: \"f05091b2-5737-4495-8154-7ec82505eb67\") " pod="openstack-operators/openstack-operator-index-wtkps"
Dec 01 10:44:22 crc kubenswrapper[4909]: I1201 10:44:22.969606 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzmdd\" (UniqueName: \"kubernetes.io/projected/f05091b2-5737-4495-8154-7ec82505eb67-kube-api-access-bzmdd\") pod \"openstack-operator-index-wtkps\" (UID: \"f05091b2-5737-4495-8154-7ec82505eb67\") " pod="openstack-operators/openstack-operator-index-wtkps"
Dec 01 10:44:23 crc kubenswrapper[4909]: I1201 10:44:23.059526 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wtkps"
Dec 01 10:44:23 crc kubenswrapper[4909]: I1201 10:44:23.463796 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wtkps"]
Dec 01 10:44:23 crc kubenswrapper[4909]: I1201 10:44:23.637916 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-xdgcz"
Dec 01 10:44:23 crc kubenswrapper[4909]: I1201 10:44:23.721945 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-xdgcz"
Dec 01 10:44:23 crc kubenswrapper[4909]: I1201 10:44:23.849495 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wtkps" event={"ID":"f05091b2-5737-4495-8154-7ec82505eb67","Type":"ContainerStarted","Data":"ff823f8df28cbf2cec481461c4a515bce45bb513394820824d761eb35883c8b8"}
Dec 01 10:44:26 crc kubenswrapper[4909]: I1201 10:44:26.439967 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wtkps"]
Dec 01 10:44:26 crc kubenswrapper[4909]: I1201 10:44:26.871179 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wtkps" event={"ID":"f05091b2-5737-4495-8154-7ec82505eb67","Type":"ContainerStarted","Data":"7f4014afbca9a8d5765b0793ae1e540043576e5118df2a18183fd1ff4cb79b67"}
Dec 01 10:44:26 crc kubenswrapper[4909]: I1201 10:44:26.891192 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wtkps" podStartSLOduration=2.500692657 podStartE2EDuration="4.891170394s" podCreationTimestamp="2025-12-01 10:44:22 +0000 UTC" firstStartedPulling="2025-12-01 10:44:23.467497706 +0000 UTC m=+780.701968604" lastFinishedPulling="2025-12-01 10:44:25.857975443 +0000 UTC m=+783.092446341" observedRunningTime="2025-12-01 10:44:26.886910651 +0000 UTC m=+784.121381549" watchObservedRunningTime="2025-12-01 10:44:26.891170394 +0000 UTC m=+784.125641292"
Dec 01 10:44:27 crc kubenswrapper[4909]: I1201 10:44:27.046227 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-c8hpw"]
Dec 01 10:44:27 crc kubenswrapper[4909]: I1201 10:44:27.047269 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-c8hpw"
Dec 01 10:44:27 crc kubenswrapper[4909]: I1201 10:44:27.057409 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-c8hpw"]
Dec 01 10:44:27 crc kubenswrapper[4909]: I1201 10:44:27.211710 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnctf\" (UniqueName: \"kubernetes.io/projected/945864bb-ec20-4030-83cb-383340ff9107-kube-api-access-tnctf\") pod \"openstack-operator-index-c8hpw\" (UID: \"945864bb-ec20-4030-83cb-383340ff9107\") " pod="openstack-operators/openstack-operator-index-c8hpw"
Dec 01 10:44:27 crc kubenswrapper[4909]: I1201 10:44:27.313905 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnctf\" (UniqueName: \"kubernetes.io/projected/945864bb-ec20-4030-83cb-383340ff9107-kube-api-access-tnctf\") pod \"openstack-operator-index-c8hpw\" (UID: \"945864bb-ec20-4030-83cb-383340ff9107\") " pod="openstack-operators/openstack-operator-index-c8hpw"
Dec 01 10:44:27 crc kubenswrapper[4909]: I1201 10:44:27.334370 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnctf\" (UniqueName: \"kubernetes.io/projected/945864bb-ec20-4030-83cb-383340ff9107-kube-api-access-tnctf\") pod \"openstack-operator-index-c8hpw\" (UID: \"945864bb-ec20-4030-83cb-383340ff9107\") " pod="openstack-operators/openstack-operator-index-c8hpw"
Dec 01 10:44:27 crc kubenswrapper[4909]: I1201 10:44:27.371345 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-c8hpw"
Dec 01 10:44:27 crc kubenswrapper[4909]: I1201 10:44:27.610567 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-c8hpw"]
Dec 01 10:44:27 crc kubenswrapper[4909]: W1201 10:44:27.613006 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod945864bb_ec20_4030_83cb_383340ff9107.slice/crio-305d9363c9f813a39501955b783eaf7c670794f8ae51b139d0b509402be28e37 WatchSource:0}: Error finding container 305d9363c9f813a39501955b783eaf7c670794f8ae51b139d0b509402be28e37: Status 404 returned error can't find the container with id 305d9363c9f813a39501955b783eaf7c670794f8ae51b139d0b509402be28e37
Dec 01 10:44:27 crc kubenswrapper[4909]: I1201 10:44:27.878693 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-c8hpw" event={"ID":"945864bb-ec20-4030-83cb-383340ff9107","Type":"ContainerStarted","Data":"d0e8688aeb3ce077f1b7c27431cb841c9702287189ae70018361ce8e8ef7e1c8"}
Dec 01 10:44:27 crc kubenswrapper[4909]: I1201 10:44:27.879073 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-c8hpw" event={"ID":"945864bb-ec20-4030-83cb-383340ff9107","Type":"ContainerStarted","Data":"305d9363c9f813a39501955b783eaf7c670794f8ae51b139d0b509402be28e37"}
Dec 01 10:44:27 crc kubenswrapper[4909]: I1201 10:44:27.878848 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-wtkps" podUID="f05091b2-5737-4495-8154-7ec82505eb67" containerName="registry-server" containerID="cri-o://7f4014afbca9a8d5765b0793ae1e540043576e5118df2a18183fd1ff4cb79b67" gracePeriod=2
Dec 01 10:44:27 crc kubenswrapper[4909]: I1201 10:44:27.903754 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-c8hpw" podStartSLOduration=0.803075006 podStartE2EDuration="903.736883ms" podCreationTimestamp="2025-12-01 10:44:27 +0000 UTC" firstStartedPulling="2025-12-01 10:44:27.616318356 +0000 UTC m=+784.850789254" lastFinishedPulling="2025-12-01 10:44:27.716980243 +0000 UTC m=+784.951451131" observedRunningTime="2025-12-01 10:44:27.901486123 +0000 UTC m=+785.135957041" watchObservedRunningTime="2025-12-01 10:44:27.903736883 +0000 UTC m=+785.138207771"
Dec 01 10:44:28 crc kubenswrapper[4909]: I1201 10:44:28.033496 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-22tnq"
Dec 01 10:44:28 crc kubenswrapper[4909]: I1201 10:44:28.220728 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wtkps"
Dec 01 10:44:28 crc kubenswrapper[4909]: I1201 10:44:28.329191 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzmdd\" (UniqueName: \"kubernetes.io/projected/f05091b2-5737-4495-8154-7ec82505eb67-kube-api-access-bzmdd\") pod \"f05091b2-5737-4495-8154-7ec82505eb67\" (UID: \"f05091b2-5737-4495-8154-7ec82505eb67\") "
Dec 01 10:44:28 crc kubenswrapper[4909]: I1201 10:44:28.335121 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f05091b2-5737-4495-8154-7ec82505eb67-kube-api-access-bzmdd" (OuterVolumeSpecName: "kube-api-access-bzmdd") pod "f05091b2-5737-4495-8154-7ec82505eb67" (UID: "f05091b2-5737-4495-8154-7ec82505eb67"). InnerVolumeSpecName "kube-api-access-bzmdd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:44:28 crc kubenswrapper[4909]: I1201 10:44:28.430745 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzmdd\" (UniqueName: \"kubernetes.io/projected/f05091b2-5737-4495-8154-7ec82505eb67-kube-api-access-bzmdd\") on node \"crc\" DevicePath \"\""
Dec 01 10:44:28 crc kubenswrapper[4909]: I1201 10:44:28.643860 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-xdgcz"
Dec 01 10:44:28 crc kubenswrapper[4909]: I1201 10:44:28.886415 4909 generic.go:334] "Generic (PLEG): container finished" podID="f05091b2-5737-4495-8154-7ec82505eb67" containerID="7f4014afbca9a8d5765b0793ae1e540043576e5118df2a18183fd1ff4cb79b67" exitCode=0
Dec 01 10:44:28 crc kubenswrapper[4909]: I1201 10:44:28.886473 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wtkps" event={"ID":"f05091b2-5737-4495-8154-7ec82505eb67","Type":"ContainerDied","Data":"7f4014afbca9a8d5765b0793ae1e540043576e5118df2a18183fd1ff4cb79b67"}
Dec 01 10:44:28 crc kubenswrapper[4909]: I1201 10:44:28.886502 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wtkps"
Dec 01 10:44:28 crc kubenswrapper[4909]: I1201 10:44:28.886530 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wtkps" event={"ID":"f05091b2-5737-4495-8154-7ec82505eb67","Type":"ContainerDied","Data":"ff823f8df28cbf2cec481461c4a515bce45bb513394820824d761eb35883c8b8"}
Dec 01 10:44:28 crc kubenswrapper[4909]: I1201 10:44:28.886581 4909 scope.go:117] "RemoveContainer" containerID="7f4014afbca9a8d5765b0793ae1e540043576e5118df2a18183fd1ff4cb79b67"
Dec 01 10:44:28 crc kubenswrapper[4909]: I1201 10:44:28.907407 4909 scope.go:117] "RemoveContainer" containerID="7f4014afbca9a8d5765b0793ae1e540043576e5118df2a18183fd1ff4cb79b67"
Dec 01 10:44:28 crc kubenswrapper[4909]: E1201 10:44:28.907906 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f4014afbca9a8d5765b0793ae1e540043576e5118df2a18183fd1ff4cb79b67\": container with ID starting with 7f4014afbca9a8d5765b0793ae1e540043576e5118df2a18183fd1ff4cb79b67 not found: ID does not exist" containerID="7f4014afbca9a8d5765b0793ae1e540043576e5118df2a18183fd1ff4cb79b67"
Dec 01 10:44:28 crc kubenswrapper[4909]: I1201 10:44:28.907945 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f4014afbca9a8d5765b0793ae1e540043576e5118df2a18183fd1ff4cb79b67"} err="failed to get container status \"7f4014afbca9a8d5765b0793ae1e540043576e5118df2a18183fd1ff4cb79b67\": rpc error: code = NotFound desc = could not find container \"7f4014afbca9a8d5765b0793ae1e540043576e5118df2a18183fd1ff4cb79b67\": container with ID starting with 7f4014afbca9a8d5765b0793ae1e540043576e5118df2a18183fd1ff4cb79b67 not found: ID does not exist"
Dec 01 10:44:28 crc kubenswrapper[4909]: I1201 10:44:28.914058 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wtkps"]
Dec 01 10:44:28 crc kubenswrapper[4909]: I1201 10:44:28.919671 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-wtkps"]
Dec 01 10:44:29 crc kubenswrapper[4909]: I1201 10:44:29.264831 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f05091b2-5737-4495-8154-7ec82505eb67" path="/var/lib/kubelet/pods/f05091b2-5737-4495-8154-7ec82505eb67/volumes"
Dec 01 10:44:37 crc kubenswrapper[4909]: I1201 10:44:37.371854 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-c8hpw"
Dec 01 10:44:37 crc kubenswrapper[4909]: I1201 10:44:37.372530 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-c8hpw"
Dec 01 10:44:37 crc kubenswrapper[4909]: I1201 10:44:37.411116 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-c8hpw"
Dec 01 10:44:37 crc kubenswrapper[4909]: I1201 10:44:37.996299 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-c8hpw"
Dec 01 10:44:39 crc kubenswrapper[4909]: I1201 10:44:39.482528 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd"]
Dec 01 10:44:39 crc kubenswrapper[4909]: E1201 10:44:39.482803 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f05091b2-5737-4495-8154-7ec82505eb67" containerName="registry-server"
Dec 01 10:44:39 crc kubenswrapper[4909]: I1201 10:44:39.482816 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05091b2-5737-4495-8154-7ec82505eb67" containerName="registry-server"
Dec 01 10:44:39 crc kubenswrapper[4909]: I1201 10:44:39.482948 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f05091b2-5737-4495-8154-7ec82505eb67" containerName="registry-server"
Dec 01 10:44:39 crc kubenswrapper[4909]: I1201 10:44:39.483764 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd"
Dec 01 10:44:39 crc kubenswrapper[4909]: I1201 10:44:39.489272 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-rfrml"
Dec 01 10:44:39 crc kubenswrapper[4909]: I1201 10:44:39.500028 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd"]
Dec 01 10:44:39 crc kubenswrapper[4909]: I1201 10:44:39.601243 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppv5r\" (UniqueName: \"kubernetes.io/projected/6e65ad27-ab58-4879-af77-a626651a5be9-kube-api-access-ppv5r\") pod \"831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd\" (UID: \"6e65ad27-ab58-4879-af77-a626651a5be9\") " pod="openstack-operators/831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd"
Dec 01 10:44:39 crc kubenswrapper[4909]: I1201 10:44:39.601334 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e65ad27-ab58-4879-af77-a626651a5be9-bundle\") pod \"831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd\" (UID: \"6e65ad27-ab58-4879-af77-a626651a5be9\") " pod="openstack-operators/831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd"
Dec 01 10:44:39 crc kubenswrapper[4909]: I1201 10:44:39.601367 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e65ad27-ab58-4879-af77-a626651a5be9-util\") pod \"831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd\" (UID: \"6e65ad27-ab58-4879-af77-a626651a5be9\") " pod="openstack-operators/831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd"
Dec 01 10:44:39 crc kubenswrapper[4909]: I1201 10:44:39.702210 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e65ad27-ab58-4879-af77-a626651a5be9-util\") pod \"831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd\" (UID: \"6e65ad27-ab58-4879-af77-a626651a5be9\") " pod="openstack-operators/831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd"
Dec 01 10:44:39 crc kubenswrapper[4909]: I1201 10:44:39.702294 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppv5r\" (UniqueName: \"kubernetes.io/projected/6e65ad27-ab58-4879-af77-a626651a5be9-kube-api-access-ppv5r\") pod \"831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd\" (UID: \"6e65ad27-ab58-4879-af77-a626651a5be9\") " pod="openstack-operators/831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd"
Dec 01 10:44:39 crc kubenswrapper[4909]: I1201 10:44:39.702352 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e65ad27-ab58-4879-af77-a626651a5be9-bundle\") pod \"831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd\" (UID: \"6e65ad27-ab58-4879-af77-a626651a5be9\") " pod="openstack-operators/831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd"
Dec 01 10:44:39 crc kubenswrapper[4909]: I1201 10:44:39.702819 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e65ad27-ab58-4879-af77-a626651a5be9-util\") pod \"831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd\" (UID: \"6e65ad27-ab58-4879-af77-a626651a5be9\") " pod="openstack-operators/831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd"
Dec 01 10:44:39 crc kubenswrapper[4909]: I1201 10:44:39.702901 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e65ad27-ab58-4879-af77-a626651a5be9-bundle\") pod \"831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd\" (UID: \"6e65ad27-ab58-4879-af77-a626651a5be9\") " pod="openstack-operators/831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd"
Dec 01 10:44:39 crc kubenswrapper[4909]: I1201 10:44:39.730311 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppv5r\" (UniqueName: \"kubernetes.io/projected/6e65ad27-ab58-4879-af77-a626651a5be9-kube-api-access-ppv5r\") pod \"831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd\" (UID: \"6e65ad27-ab58-4879-af77-a626651a5be9\") " pod="openstack-operators/831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd"
Dec 01 10:44:39 crc kubenswrapper[4909]: I1201 10:44:39.802253 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd"
Dec 01 10:44:40 crc kubenswrapper[4909]: I1201 10:44:40.259180 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd"]
Dec 01 10:44:40 crc kubenswrapper[4909]: W1201 10:44:40.264738 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e65ad27_ab58_4879_af77_a626651a5be9.slice/crio-24b6176eb5c0320febfc6a7d5d5162047b512cad26969c9810795afeabd6dac8 WatchSource:0}: Error finding container 24b6176eb5c0320febfc6a7d5d5162047b512cad26969c9810795afeabd6dac8: Status 404 returned error can't find the container with id 24b6176eb5c0320febfc6a7d5d5162047b512cad26969c9810795afeabd6dac8
Dec 01 10:44:40 crc kubenswrapper[4909]: I1201 10:44:40.996261 4909 generic.go:334] "Generic (PLEG): container finished" podID="6e65ad27-ab58-4879-af77-a626651a5be9" containerID="6d48ef990587c7d772e97d207d5cc578b33eb10cd472821d077941c3508353e5" exitCode=0
Dec 01 10:44:40 crc kubenswrapper[4909]: I1201 10:44:40.996310 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd" event={"ID":"6e65ad27-ab58-4879-af77-a626651a5be9","Type":"ContainerDied","Data":"6d48ef990587c7d772e97d207d5cc578b33eb10cd472821d077941c3508353e5"}
Dec 01 10:44:40 crc kubenswrapper[4909]: I1201 10:44:40.996569 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd" event={"ID":"6e65ad27-ab58-4879-af77-a626651a5be9","Type":"ContainerStarted","Data":"24b6176eb5c0320febfc6a7d5d5162047b512cad26969c9810795afeabd6dac8"}
Dec 01 10:44:42 crc kubenswrapper[4909]: I1201 10:44:42.004561 4909 generic.go:334] "Generic (PLEG): container finished" podID="6e65ad27-ab58-4879-af77-a626651a5be9" containerID="baaef42b19a9aab5ce66a6f7cab6d34fd6403aa5dc2366cc1c14da2b6f805ae8" exitCode=0
Dec 01 10:44:42 crc kubenswrapper[4909]: I1201 10:44:42.004653 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd" event={"ID":"6e65ad27-ab58-4879-af77-a626651a5be9","Type":"ContainerDied","Data":"baaef42b19a9aab5ce66a6f7cab6d34fd6403aa5dc2366cc1c14da2b6f805ae8"}
Dec 01 10:44:43 crc kubenswrapper[4909]: I1201 10:44:43.022257 4909 generic.go:334] "Generic (PLEG): container finished" podID="6e65ad27-ab58-4879-af77-a626651a5be9" containerID="82825ea62fb57a631ba3aad512d8de6b2aa0320ec1d63994e9bfe08a5fe51391" exitCode=0
Dec 01 10:44:43 crc kubenswrapper[4909]: I1201 10:44:43.022353 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd" event={"ID":"6e65ad27-ab58-4879-af77-a626651a5be9","Type":"ContainerDied","Data":"82825ea62fb57a631ba3aad512d8de6b2aa0320ec1d63994e9bfe08a5fe51391"}
Dec 01 10:44:44 crc kubenswrapper[4909]: I1201 10:44:44.264645 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd"
Dec 01 10:44:44 crc kubenswrapper[4909]: I1201 10:44:44.377568 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e65ad27-ab58-4879-af77-a626651a5be9-bundle\") pod \"6e65ad27-ab58-4879-af77-a626651a5be9\" (UID: \"6e65ad27-ab58-4879-af77-a626651a5be9\") "
Dec 01 10:44:44 crc kubenswrapper[4909]: I1201 10:44:44.377665 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e65ad27-ab58-4879-af77-a626651a5be9-util\") pod \"6e65ad27-ab58-4879-af77-a626651a5be9\" (UID: \"6e65ad27-ab58-4879-af77-a626651a5be9\") "
Dec 01 10:44:44 crc kubenswrapper[4909]: I1201 10:44:44.377708 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppv5r\" (UniqueName: \"kubernetes.io/projected/6e65ad27-ab58-4879-af77-a626651a5be9-kube-api-access-ppv5r\") pod \"6e65ad27-ab58-4879-af77-a626651a5be9\" (UID: \"6e65ad27-ab58-4879-af77-a626651a5be9\") "
Dec 01 10:44:44 crc kubenswrapper[4909]: I1201 10:44:44.378689 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e65ad27-ab58-4879-af77-a626651a5be9-bundle" (OuterVolumeSpecName: "bundle") pod "6e65ad27-ab58-4879-af77-a626651a5be9" (UID: "6e65ad27-ab58-4879-af77-a626651a5be9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:44:44 crc kubenswrapper[4909]: I1201 10:44:44.379637 4909 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e65ad27-ab58-4879-af77-a626651a5be9-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 10:44:44 crc kubenswrapper[4909]: I1201 10:44:44.383835 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e65ad27-ab58-4879-af77-a626651a5be9-kube-api-access-ppv5r" (OuterVolumeSpecName: "kube-api-access-ppv5r") pod "6e65ad27-ab58-4879-af77-a626651a5be9" (UID: "6e65ad27-ab58-4879-af77-a626651a5be9"). InnerVolumeSpecName "kube-api-access-ppv5r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:44:44 crc kubenswrapper[4909]: I1201 10:44:44.392360 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e65ad27-ab58-4879-af77-a626651a5be9-util" (OuterVolumeSpecName: "util") pod "6e65ad27-ab58-4879-af77-a626651a5be9" (UID: "6e65ad27-ab58-4879-af77-a626651a5be9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:44:44 crc kubenswrapper[4909]: I1201 10:44:44.481934 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppv5r\" (UniqueName: \"kubernetes.io/projected/6e65ad27-ab58-4879-af77-a626651a5be9-kube-api-access-ppv5r\") on node \"crc\" DevicePath \"\""
Dec 01 10:44:44 crc kubenswrapper[4909]: I1201 10:44:44.481968 4909 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e65ad27-ab58-4879-af77-a626651a5be9-util\") on node \"crc\" DevicePath \"\""
Dec 01 10:44:45 crc kubenswrapper[4909]: I1201 10:44:45.037203 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd" event={"ID":"6e65ad27-ab58-4879-af77-a626651a5be9","Type":"ContainerDied","Data":"24b6176eb5c0320febfc6a7d5d5162047b512cad26969c9810795afeabd6dac8"}
Dec 01 10:44:45 crc kubenswrapper[4909]: I1201 10:44:45.037280 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24b6176eb5c0320febfc6a7d5d5162047b512cad26969c9810795afeabd6dac8"
Dec 01 10:44:45 crc kubenswrapper[4909]: I1201 10:44:45.037285 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd"
Dec 01 10:44:51 crc kubenswrapper[4909]: I1201 10:44:51.519708 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7f6d49fd8b-fs7ts"]
Dec 01 10:44:51 crc kubenswrapper[4909]: E1201 10:44:51.524767 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e65ad27-ab58-4879-af77-a626651a5be9" containerName="extract"
Dec 01 10:44:51 crc kubenswrapper[4909]: I1201 10:44:51.524958 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e65ad27-ab58-4879-af77-a626651a5be9" containerName="extract"
Dec 01 10:44:51 crc kubenswrapper[4909]: E1201 10:44:51.525073 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e65ad27-ab58-4879-af77-a626651a5be9" containerName="pull"
Dec 01 10:44:51 crc kubenswrapper[4909]: I1201 10:44:51.525144 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e65ad27-ab58-4879-af77-a626651a5be9" containerName="pull"
Dec 01 10:44:51 crc kubenswrapper[4909]: E1201 10:44:51.525212 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e65ad27-ab58-4879-af77-a626651a5be9" containerName="util"
Dec 01 10:44:51 crc kubenswrapper[4909]: I1201 10:44:51.525278 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e65ad27-ab58-4879-af77-a626651a5be9" containerName="util"
Dec 01 10:44:51 crc kubenswrapper[4909]: I1201 10:44:51.525530 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e65ad27-ab58-4879-af77-a626651a5be9" containerName="extract"
Dec 01 10:44:51 crc kubenswrapper[4909]: I1201 10:44:51.526331 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7f6d49fd8b-fs7ts"
Dec 01 10:44:51 crc kubenswrapper[4909]: I1201 10:44:51.535061 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-v2jw4"
Dec 01 10:44:51 crc kubenswrapper[4909]: I1201 10:44:51.552284 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7f6d49fd8b-fs7ts"]
Dec 01 10:44:51 crc kubenswrapper[4909]: I1201 10:44:51.593802 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shzmv\" (UniqueName: \"kubernetes.io/projected/3ec99ef5-053c-4815-b711-aa6798c05e05-kube-api-access-shzmv\") pod \"openstack-operator-controller-operator-7f6d49fd8b-fs7ts\" (UID: \"3ec99ef5-053c-4815-b711-aa6798c05e05\") " pod="openstack-operators/openstack-operator-controller-operator-7f6d49fd8b-fs7ts"
Dec 01 10:44:51 crc kubenswrapper[4909]: I1201 10:44:51.697076 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shzmv\" (UniqueName: \"kubernetes.io/projected/3ec99ef5-053c-4815-b711-aa6798c05e05-kube-api-access-shzmv\") pod \"openstack-operator-controller-operator-7f6d49fd8b-fs7ts\" (UID: \"3ec99ef5-053c-4815-b711-aa6798c05e05\") " pod="openstack-operators/openstack-operator-controller-operator-7f6d49fd8b-fs7ts"
Dec 01 10:44:51 crc kubenswrapper[4909]: I1201 10:44:51.723354 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shzmv\" (UniqueName: \"kubernetes.io/projected/3ec99ef5-053c-4815-b711-aa6798c05e05-kube-api-access-shzmv\") pod \"openstack-operator-controller-operator-7f6d49fd8b-fs7ts\" (UID: \"3ec99ef5-053c-4815-b711-aa6798c05e05\") " pod="openstack-operators/openstack-operator-controller-operator-7f6d49fd8b-fs7ts"
Dec 01 10:44:51 crc kubenswrapper[4909]: I1201 10:44:51.888050 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7f6d49fd8b-fs7ts"
Dec 01 10:44:52 crc kubenswrapper[4909]: I1201 10:44:52.382089 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7f6d49fd8b-fs7ts"]
Dec 01 10:44:52 crc kubenswrapper[4909]: W1201 10:44:52.402156 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ec99ef5_053c_4815_b711_aa6798c05e05.slice/crio-e5cf6022b1c49dc9ce1b0525f39f4b6c54c2a94e1c899b14cf596a7fdfa39942 WatchSource:0}: Error finding container e5cf6022b1c49dc9ce1b0525f39f4b6c54c2a94e1c899b14cf596a7fdfa39942: Status 404 returned error can't find the container with id e5cf6022b1c49dc9ce1b0525f39f4b6c54c2a94e1c899b14cf596a7fdfa39942
Dec 01 10:44:53 crc kubenswrapper[4909]: I1201 10:44:53.091328 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7f6d49fd8b-fs7ts" event={"ID":"3ec99ef5-053c-4815-b711-aa6798c05e05","Type":"ContainerStarted","Data":"e5cf6022b1c49dc9ce1b0525f39f4b6c54c2a94e1c899b14cf596a7fdfa39942"}
Dec 01 10:44:58 crc kubenswrapper[4909]: I1201 10:44:58.128670 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7f6d49fd8b-fs7ts" event={"ID":"3ec99ef5-053c-4815-b711-aa6798c05e05","Type":"ContainerStarted","Data":"1f8fcaff9df4417c53d18ed9d9178643167a41c6a52e98f4c2b0b2c328235be5"}
Dec 01 10:44:58 crc kubenswrapper[4909]: I1201 10:44:58.129391 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7f6d49fd8b-fs7ts"
Dec 01 10:44:58 crc kubenswrapper[4909]: I1201 10:44:58.161116 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openstack-operators/openstack-operator-controller-operator-7f6d49fd8b-fs7ts" podStartSLOduration=2.592740407 podStartE2EDuration="7.161084597s" podCreationTimestamp="2025-12-01 10:44:51 +0000 UTC" firstStartedPulling="2025-12-01 10:44:52.403993076 +0000 UTC m=+809.638463974" lastFinishedPulling="2025-12-01 10:44:56.972337246 +0000 UTC m=+814.206808164" observedRunningTime="2025-12-01 10:44:58.155858964 +0000 UTC m=+815.390329862" watchObservedRunningTime="2025-12-01 10:44:58.161084597 +0000 UTC m=+815.395555495" Dec 01 10:45:00 crc kubenswrapper[4909]: I1201 10:45:00.159119 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409765-sh7p9"] Dec 01 10:45:00 crc kubenswrapper[4909]: I1201 10:45:00.160136 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-sh7p9" Dec 01 10:45:00 crc kubenswrapper[4909]: I1201 10:45:00.162110 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 10:45:00 crc kubenswrapper[4909]: I1201 10:45:00.163362 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 10:45:00 crc kubenswrapper[4909]: I1201 10:45:00.177510 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409765-sh7p9"] Dec 01 10:45:00 crc kubenswrapper[4909]: I1201 10:45:00.251699 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/66988836-e4d6-497d-bab2-d52170d8d0ef-secret-volume\") pod \"collect-profiles-29409765-sh7p9\" (UID: \"66988836-e4d6-497d-bab2-d52170d8d0ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-sh7p9" Dec 01 10:45:00 crc 
kubenswrapper[4909]: I1201 10:45:00.251800 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/66988836-e4d6-497d-bab2-d52170d8d0ef-config-volume\") pod \"collect-profiles-29409765-sh7p9\" (UID: \"66988836-e4d6-497d-bab2-d52170d8d0ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-sh7p9" Dec 01 10:45:00 crc kubenswrapper[4909]: I1201 10:45:00.251902 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs59b\" (UniqueName: \"kubernetes.io/projected/66988836-e4d6-497d-bab2-d52170d8d0ef-kube-api-access-gs59b\") pod \"collect-profiles-29409765-sh7p9\" (UID: \"66988836-e4d6-497d-bab2-d52170d8d0ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-sh7p9" Dec 01 10:45:00 crc kubenswrapper[4909]: I1201 10:45:00.353135 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/66988836-e4d6-497d-bab2-d52170d8d0ef-secret-volume\") pod \"collect-profiles-29409765-sh7p9\" (UID: \"66988836-e4d6-497d-bab2-d52170d8d0ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-sh7p9" Dec 01 10:45:00 crc kubenswrapper[4909]: I1201 10:45:00.353191 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/66988836-e4d6-497d-bab2-d52170d8d0ef-config-volume\") pod \"collect-profiles-29409765-sh7p9\" (UID: \"66988836-e4d6-497d-bab2-d52170d8d0ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-sh7p9" Dec 01 10:45:00 crc kubenswrapper[4909]: I1201 10:45:00.353259 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs59b\" (UniqueName: \"kubernetes.io/projected/66988836-e4d6-497d-bab2-d52170d8d0ef-kube-api-access-gs59b\") pod 
\"collect-profiles-29409765-sh7p9\" (UID: \"66988836-e4d6-497d-bab2-d52170d8d0ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-sh7p9" Dec 01 10:45:00 crc kubenswrapper[4909]: I1201 10:45:00.354584 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/66988836-e4d6-497d-bab2-d52170d8d0ef-config-volume\") pod \"collect-profiles-29409765-sh7p9\" (UID: \"66988836-e4d6-497d-bab2-d52170d8d0ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-sh7p9" Dec 01 10:45:00 crc kubenswrapper[4909]: I1201 10:45:00.359848 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/66988836-e4d6-497d-bab2-d52170d8d0ef-secret-volume\") pod \"collect-profiles-29409765-sh7p9\" (UID: \"66988836-e4d6-497d-bab2-d52170d8d0ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-sh7p9" Dec 01 10:45:00 crc kubenswrapper[4909]: I1201 10:45:00.372194 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs59b\" (UniqueName: \"kubernetes.io/projected/66988836-e4d6-497d-bab2-d52170d8d0ef-kube-api-access-gs59b\") pod \"collect-profiles-29409765-sh7p9\" (UID: \"66988836-e4d6-497d-bab2-d52170d8d0ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-sh7p9" Dec 01 10:45:00 crc kubenswrapper[4909]: I1201 10:45:00.481378 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-sh7p9" Dec 01 10:45:00 crc kubenswrapper[4909]: I1201 10:45:00.697280 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409765-sh7p9"] Dec 01 10:45:01 crc kubenswrapper[4909]: I1201 10:45:01.154291 4909 generic.go:334] "Generic (PLEG): container finished" podID="66988836-e4d6-497d-bab2-d52170d8d0ef" containerID="2826083d925be0994bc5dc78ccdbc853fca22a49abde0e4fef165ee1194f4710" exitCode=0 Dec 01 10:45:01 crc kubenswrapper[4909]: I1201 10:45:01.154436 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-sh7p9" event={"ID":"66988836-e4d6-497d-bab2-d52170d8d0ef","Type":"ContainerDied","Data":"2826083d925be0994bc5dc78ccdbc853fca22a49abde0e4fef165ee1194f4710"} Dec 01 10:45:01 crc kubenswrapper[4909]: I1201 10:45:01.154973 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-sh7p9" event={"ID":"66988836-e4d6-497d-bab2-d52170d8d0ef","Type":"ContainerStarted","Data":"3693a701736d36a211fd86190aa7fbdc61c7738877869ecef22b01dd58cfb63e"} Dec 01 10:45:02 crc kubenswrapper[4909]: I1201 10:45:02.405076 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-sh7p9" Dec 01 10:45:02 crc kubenswrapper[4909]: I1201 10:45:02.484114 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/66988836-e4d6-497d-bab2-d52170d8d0ef-secret-volume\") pod \"66988836-e4d6-497d-bab2-d52170d8d0ef\" (UID: \"66988836-e4d6-497d-bab2-d52170d8d0ef\") " Dec 01 10:45:02 crc kubenswrapper[4909]: I1201 10:45:02.484378 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/66988836-e4d6-497d-bab2-d52170d8d0ef-config-volume\") pod \"66988836-e4d6-497d-bab2-d52170d8d0ef\" (UID: \"66988836-e4d6-497d-bab2-d52170d8d0ef\") " Dec 01 10:45:02 crc kubenswrapper[4909]: I1201 10:45:02.484518 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs59b\" (UniqueName: \"kubernetes.io/projected/66988836-e4d6-497d-bab2-d52170d8d0ef-kube-api-access-gs59b\") pod \"66988836-e4d6-497d-bab2-d52170d8d0ef\" (UID: \"66988836-e4d6-497d-bab2-d52170d8d0ef\") " Dec 01 10:45:02 crc kubenswrapper[4909]: I1201 10:45:02.485436 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66988836-e4d6-497d-bab2-d52170d8d0ef-config-volume" (OuterVolumeSpecName: "config-volume") pod "66988836-e4d6-497d-bab2-d52170d8d0ef" (UID: "66988836-e4d6-497d-bab2-d52170d8d0ef"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:45:02 crc kubenswrapper[4909]: I1201 10:45:02.492223 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66988836-e4d6-497d-bab2-d52170d8d0ef-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "66988836-e4d6-497d-bab2-d52170d8d0ef" (UID: "66988836-e4d6-497d-bab2-d52170d8d0ef"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:45:02 crc kubenswrapper[4909]: I1201 10:45:02.493076 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66988836-e4d6-497d-bab2-d52170d8d0ef-kube-api-access-gs59b" (OuterVolumeSpecName: "kube-api-access-gs59b") pod "66988836-e4d6-497d-bab2-d52170d8d0ef" (UID: "66988836-e4d6-497d-bab2-d52170d8d0ef"). InnerVolumeSpecName "kube-api-access-gs59b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:45:02 crc kubenswrapper[4909]: I1201 10:45:02.586044 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs59b\" (UniqueName: \"kubernetes.io/projected/66988836-e4d6-497d-bab2-d52170d8d0ef-kube-api-access-gs59b\") on node \"crc\" DevicePath \"\"" Dec 01 10:45:02 crc kubenswrapper[4909]: I1201 10:45:02.586094 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/66988836-e4d6-497d-bab2-d52170d8d0ef-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:45:02 crc kubenswrapper[4909]: I1201 10:45:02.586107 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/66988836-e4d6-497d-bab2-d52170d8d0ef-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:45:03 crc kubenswrapper[4909]: I1201 10:45:03.167116 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-sh7p9" event={"ID":"66988836-e4d6-497d-bab2-d52170d8d0ef","Type":"ContainerDied","Data":"3693a701736d36a211fd86190aa7fbdc61c7738877869ecef22b01dd58cfb63e"} Dec 01 10:45:03 crc kubenswrapper[4909]: I1201 10:45:03.167162 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-sh7p9" Dec 01 10:45:03 crc kubenswrapper[4909]: I1201 10:45:03.167177 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3693a701736d36a211fd86190aa7fbdc61c7738877869ecef22b01dd58cfb63e" Dec 01 10:45:11 crc kubenswrapper[4909]: I1201 10:45:11.891086 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7f6d49fd8b-fs7ts" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.590428 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-hdfcg"] Dec 01 10:45:30 crc kubenswrapper[4909]: E1201 10:45:30.591267 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66988836-e4d6-497d-bab2-d52170d8d0ef" containerName="collect-profiles" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.591283 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="66988836-e4d6-497d-bab2-d52170d8d0ef" containerName="collect-profiles" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.591424 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="66988836-e4d6-497d-bab2-d52170d8d0ef" containerName="collect-profiles" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.592256 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hdfcg" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.594245 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-8fl22" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.599581 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-9txzx"] Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.600555 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-9txzx" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.604655 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-4z8m6" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.617322 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-77s9g"] Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.618749 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-77s9g" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.620810 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-bvg9w" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.651861 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8c7f494db-q9n6n"] Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.653773 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8c7f494db-q9n6n" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.658546 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-9txzx"] Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.660190 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-fxtmz" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.684141 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-77s9g"] Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.696642 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8c7f494db-q9n6n"] Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.705829 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhk7f\" (UniqueName: \"kubernetes.io/projected/19f1e6b3-cf7a-45b0-b9c2-8c48bdf6ead1-kube-api-access-mhk7f\") pod \"designate-operator-controller-manager-78b4bc895b-77s9g\" (UID: \"19f1e6b3-cf7a-45b0-b9c2-8c48bdf6ead1\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-77s9g" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.705937 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xdhx\" (UniqueName: \"kubernetes.io/projected/dc6b6fc4-e26d-4401-ba4f-f87c3a5b503c-kube-api-access-4xdhx\") pod \"barbican-operator-controller-manager-7d9dfd778-hdfcg\" (UID: \"dc6b6fc4-e26d-4401-ba4f-f87c3a5b503c\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hdfcg" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.705979 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpcrj\" (UniqueName: \"kubernetes.io/projected/cfac3bef-3c49-47ba-94be-d267a57f731a-kube-api-access-vpcrj\") pod \"cinder-operator-controller-manager-859b6ccc6-9txzx\" (UID: \"cfac3bef-3c49-47ba-94be-d267a57f731a\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-9txzx" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.725019 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z98b8"] Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.726275 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z98b8" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.733859 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-hdfcg"] Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.737772 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-nqpjk" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.747177 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z98b8"] Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.783401 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wjqrn"] Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.784889 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wjqrn" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.791912 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-mjxhk" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.809751 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xdhx\" (UniqueName: \"kubernetes.io/projected/dc6b6fc4-e26d-4401-ba4f-f87c3a5b503c-kube-api-access-4xdhx\") pod \"barbican-operator-controller-manager-7d9dfd778-hdfcg\" (UID: \"dc6b6fc4-e26d-4401-ba4f-f87c3a5b503c\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hdfcg" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.809811 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6tgh\" (UniqueName: \"kubernetes.io/projected/9cbe8fe9-4f7c-4ba7-b6c3-ff8a074fcc41-kube-api-access-q6tgh\") pod \"glance-operator-controller-manager-8c7f494db-q9n6n\" (UID: \"9cbe8fe9-4f7c-4ba7-b6c3-ff8a074fcc41\") " pod="openstack-operators/glance-operator-controller-manager-8c7f494db-q9n6n" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.809847 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpcrj\" (UniqueName: \"kubernetes.io/projected/cfac3bef-3c49-47ba-94be-d267a57f731a-kube-api-access-vpcrj\") pod \"cinder-operator-controller-manager-859b6ccc6-9txzx\" (UID: \"cfac3bef-3c49-47ba-94be-d267a57f731a\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-9txzx" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.809894 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb62h\" (UniqueName: \"kubernetes.io/projected/67a7fb10-11c3-4fb1-90c7-ea30b122719f-kube-api-access-bb62h\") pod 
\"heat-operator-controller-manager-5f64f6f8bb-z98b8\" (UID: \"67a7fb10-11c3-4fb1-90c7-ea30b122719f\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z98b8" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.809979 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhk7f\" (UniqueName: \"kubernetes.io/projected/19f1e6b3-cf7a-45b0-b9c2-8c48bdf6ead1-kube-api-access-mhk7f\") pod \"designate-operator-controller-manager-78b4bc895b-77s9g\" (UID: \"19f1e6b3-cf7a-45b0-b9c2-8c48bdf6ead1\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-77s9g" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.811634 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wjqrn"] Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.839698 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xdhx\" (UniqueName: \"kubernetes.io/projected/dc6b6fc4-e26d-4401-ba4f-f87c3a5b503c-kube-api-access-4xdhx\") pod \"barbican-operator-controller-manager-7d9dfd778-hdfcg\" (UID: \"dc6b6fc4-e26d-4401-ba4f-f87c3a5b503c\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hdfcg" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.844175 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpcrj\" (UniqueName: \"kubernetes.io/projected/cfac3bef-3c49-47ba-94be-d267a57f731a-kube-api-access-vpcrj\") pod \"cinder-operator-controller-manager-859b6ccc6-9txzx\" (UID: \"cfac3bef-3c49-47ba-94be-d267a57f731a\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-9txzx" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.850517 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz"] Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 
10:45:30.852738 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.856405 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-ddq5g" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.856742 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.860978 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhk7f\" (UniqueName: \"kubernetes.io/projected/19f1e6b3-cf7a-45b0-b9c2-8c48bdf6ead1-kube-api-access-mhk7f\") pod \"designate-operator-controller-manager-78b4bc895b-77s9g\" (UID: \"19f1e6b3-cf7a-45b0-b9c2-8c48bdf6ead1\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-77s9g" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.891203 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-nxbp4"] Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.902362 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-nxbp4" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.906496 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-h49zc" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.917941 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz"] Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.934839 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6tgh\" (UniqueName: \"kubernetes.io/projected/9cbe8fe9-4f7c-4ba7-b6c3-ff8a074fcc41-kube-api-access-q6tgh\") pod \"glance-operator-controller-manager-8c7f494db-q9n6n\" (UID: \"9cbe8fe9-4f7c-4ba7-b6c3-ff8a074fcc41\") " pod="openstack-operators/glance-operator-controller-manager-8c7f494db-q9n6n" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.934949 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb62h\" (UniqueName: \"kubernetes.io/projected/67a7fb10-11c3-4fb1-90c7-ea30b122719f-kube-api-access-bb62h\") pod \"heat-operator-controller-manager-5f64f6f8bb-z98b8\" (UID: \"67a7fb10-11c3-4fb1-90c7-ea30b122719f\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z98b8" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.935105 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds6x5\" (UniqueName: \"kubernetes.io/projected/1cedb704-1d3a-4c68-b14e-00bc65309891-kube-api-access-ds6x5\") pod \"horizon-operator-controller-manager-68c6d99b8f-wjqrn\" (UID: \"1cedb704-1d3a-4c68-b14e-00bc65309891\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wjqrn" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.935500 4909 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hdfcg" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.945770 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-9txzx" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.958298 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-77s9g" Dec 01 10:45:30 crc kubenswrapper[4909]: I1201 10:45:30.973261 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb62h\" (UniqueName: \"kubernetes.io/projected/67a7fb10-11c3-4fb1-90c7-ea30b122719f-kube-api-access-bb62h\") pod \"heat-operator-controller-manager-5f64f6f8bb-z98b8\" (UID: \"67a7fb10-11c3-4fb1-90c7-ea30b122719f\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z98b8" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.020525 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6tgh\" (UniqueName: \"kubernetes.io/projected/9cbe8fe9-4f7c-4ba7-b6c3-ff8a074fcc41-kube-api-access-q6tgh\") pod \"glance-operator-controller-manager-8c7f494db-q9n6n\" (UID: \"9cbe8fe9-4f7c-4ba7-b6c3-ff8a074fcc41\") " pod="openstack-operators/glance-operator-controller-manager-8c7f494db-q9n6n" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.029195 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-q7zr4"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.030742 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-q7zr4" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.042542 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-jpqrt" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.042919 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74qpf\" (UniqueName: \"kubernetes.io/projected/949868bd-22ef-467f-b767-2d6dbd02dab1-kube-api-access-74qpf\") pod \"ironic-operator-controller-manager-6c548fd776-nxbp4\" (UID: \"949868bd-22ef-467f-b767-2d6dbd02dab1\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-nxbp4" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.042996 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w8zb\" (UniqueName: \"kubernetes.io/projected/246fed43-c776-4c8c-a6ac-1f4364191266-kube-api-access-2w8zb\") pod \"infra-operator-controller-manager-57548d458d-wbqsz\" (UID: \"246fed43-c776-4c8c-a6ac-1f4364191266\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.043047 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds6x5\" (UniqueName: \"kubernetes.io/projected/1cedb704-1d3a-4c68-b14e-00bc65309891-kube-api-access-ds6x5\") pod \"horizon-operator-controller-manager-68c6d99b8f-wjqrn\" (UID: \"1cedb704-1d3a-4c68-b14e-00bc65309891\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wjqrn" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.043073 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/246fed43-c776-4c8c-a6ac-1f4364191266-cert\") pod 
\"infra-operator-controller-manager-57548d458d-wbqsz\" (UID: \"246fed43-c776-4c8c-a6ac-1f4364191266\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.052980 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-nxbp4"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.056793 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z98b8" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.090798 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-q7zr4"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.098590 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds6x5\" (UniqueName: \"kubernetes.io/projected/1cedb704-1d3a-4c68-b14e-00bc65309891-kube-api-access-ds6x5\") pod \"horizon-operator-controller-manager-68c6d99b8f-wjqrn\" (UID: \"1cedb704-1d3a-4c68-b14e-00bc65309891\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wjqrn" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.115895 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-twgwh"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.116250 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wjqrn" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.117183 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-twgwh" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.119621 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-7l7wj" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.145292 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdrtm\" (UniqueName: \"kubernetes.io/projected/79de64b2-dbd0-4877-9556-8fb199c9786c-kube-api-access-tdrtm\") pod \"keystone-operator-controller-manager-546d4bdf48-q7zr4\" (UID: \"79de64b2-dbd0-4877-9556-8fb199c9786c\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-q7zr4" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.145994 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74qpf\" (UniqueName: \"kubernetes.io/projected/949868bd-22ef-467f-b767-2d6dbd02dab1-kube-api-access-74qpf\") pod \"ironic-operator-controller-manager-6c548fd776-nxbp4\" (UID: \"949868bd-22ef-467f-b767-2d6dbd02dab1\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-nxbp4" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.146133 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w8zb\" (UniqueName: \"kubernetes.io/projected/246fed43-c776-4c8c-a6ac-1f4364191266-kube-api-access-2w8zb\") pod \"infra-operator-controller-manager-57548d458d-wbqsz\" (UID: \"246fed43-c776-4c8c-a6ac-1f4364191266\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.146235 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/246fed43-c776-4c8c-a6ac-1f4364191266-cert\") pod \"infra-operator-controller-manager-57548d458d-wbqsz\" 
(UID: \"246fed43-c776-4c8c-a6ac-1f4364191266\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz" Dec 01 10:45:31 crc kubenswrapper[4909]: E1201 10:45:31.146440 4909 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 10:45:31 crc kubenswrapper[4909]: E1201 10:45:31.147202 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/246fed43-c776-4c8c-a6ac-1f4364191266-cert podName:246fed43-c776-4c8c-a6ac-1f4364191266 nodeName:}" failed. No retries permitted until 2025-12-01 10:45:31.647176207 +0000 UTC m=+848.881647105 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/246fed43-c776-4c8c-a6ac-1f4364191266-cert") pod "infra-operator-controller-manager-57548d458d-wbqsz" (UID: "246fed43-c776-4c8c-a6ac-1f4364191266") : secret "infra-operator-webhook-server-cert" not found Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.149753 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-twgwh"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.154438 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2rz9d"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.155963 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2rz9d" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.165108 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-7vzv6" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.177981 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w8zb\" (UniqueName: \"kubernetes.io/projected/246fed43-c776-4c8c-a6ac-1f4364191266-kube-api-access-2w8zb\") pod \"infra-operator-controller-manager-57548d458d-wbqsz\" (UID: \"246fed43-c776-4c8c-a6ac-1f4364191266\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.182812 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zbkjz"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.186233 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zbkjz" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.192379 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-ksltv" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.197174 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74qpf\" (UniqueName: \"kubernetes.io/projected/949868bd-22ef-467f-b767-2d6dbd02dab1-kube-api-access-74qpf\") pod \"ironic-operator-controller-manager-6c548fd776-nxbp4\" (UID: \"949868bd-22ef-467f-b767-2d6dbd02dab1\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-nxbp4" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.194296 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-fh7wp"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.208904 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-fh7wp" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.212440 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-2vg4c" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.224991 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zbkjz"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.235738 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-nxbp4" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.236659 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2rz9d"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.248999 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdrtm\" (UniqueName: \"kubernetes.io/projected/79de64b2-dbd0-4877-9556-8fb199c9786c-kube-api-access-tdrtm\") pod \"keystone-operator-controller-manager-546d4bdf48-q7zr4\" (UID: \"79de64b2-dbd0-4877-9556-8fb199c9786c\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-q7zr4" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.249051 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cn9z\" (UniqueName: \"kubernetes.io/projected/6d2a5c48-0cc8-4b2f-9c17-ee4d82e72e45-kube-api-access-7cn9z\") pod \"manila-operator-controller-manager-6546668bfd-twgwh\" (UID: \"6d2a5c48-0cc8-4b2f-9c17-ee4d82e72e45\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-twgwh" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.249135 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqjg5\" (UniqueName: \"kubernetes.io/projected/510857ee-52e0-4d2c-8fff-2cc3dabb0dfa-kube-api-access-pqjg5\") pod \"mariadb-operator-controller-manager-56bbcc9d85-2rz9d\" (UID: \"510857ee-52e0-4d2c-8fff-2cc3dabb0dfa\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2rz9d" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.249590 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-fh7wp"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.276317 4909 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8c7f494db-q9n6n" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.290159 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdrtm\" (UniqueName: \"kubernetes.io/projected/79de64b2-dbd0-4877-9556-8fb199c9786c-kube-api-access-tdrtm\") pod \"keystone-operator-controller-manager-546d4bdf48-q7zr4\" (UID: \"79de64b2-dbd0-4877-9556-8fb199c9786c\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-q7zr4" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.316809 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-pzh2j"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.317808 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-pzh2j"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.317828 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.318582 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.319545 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pzh2j" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.326365 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-nqb6r" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.326509 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-xtfpr" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.327152 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-v585g"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.328364 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.328657 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-v585g" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.338037 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-4mmst" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.356905 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgcg5\" (UniqueName: \"kubernetes.io/projected/282ee9d9-b6ee-4dd7-bce4-419428bd744b-kube-api-access-mgcg5\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-zbkjz\" (UID: \"282ee9d9-b6ee-4dd7-bce4-419428bd744b\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zbkjz" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.357044 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cn9z\" (UniqueName: \"kubernetes.io/projected/6d2a5c48-0cc8-4b2f-9c17-ee4d82e72e45-kube-api-access-7cn9z\") pod \"manila-operator-controller-manager-6546668bfd-twgwh\" (UID: \"6d2a5c48-0cc8-4b2f-9c17-ee4d82e72e45\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-twgwh" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.357090 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqjg5\" (UniqueName: \"kubernetes.io/projected/510857ee-52e0-4d2c-8fff-2cc3dabb0dfa-kube-api-access-pqjg5\") pod \"mariadb-operator-controller-manager-56bbcc9d85-2rz9d\" (UID: \"510857ee-52e0-4d2c-8fff-2cc3dabb0dfa\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2rz9d" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.357134 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb4kw\" (UniqueName: \"kubernetes.io/projected/8a1199cd-df49-4078-b2ab-211f902bd097-kube-api-access-qb4kw\") pod 
\"octavia-operator-controller-manager-998648c74-fh7wp\" (UID: \"8a1199cd-df49-4078-b2ab-211f902bd097\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-fh7wp" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.387975 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-7qnmh"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.389346 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7qnmh" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.398353 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-fgfz5" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.411429 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-q7zr4" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.421761 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cn9z\" (UniqueName: \"kubernetes.io/projected/6d2a5c48-0cc8-4b2f-9c17-ee4d82e72e45-kube-api-access-7cn9z\") pod \"manila-operator-controller-manager-6546668bfd-twgwh\" (UID: \"6d2a5c48-0cc8-4b2f-9c17-ee4d82e72e45\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-twgwh" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.437804 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqjg5\" (UniqueName: \"kubernetes.io/projected/510857ee-52e0-4d2c-8fff-2cc3dabb0dfa-kube-api-access-pqjg5\") pod \"mariadb-operator-controller-manager-56bbcc9d85-2rz9d\" (UID: \"510857ee-52e0-4d2c-8fff-2cc3dabb0dfa\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2rz9d" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.437909 4909 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-v585g"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.446470 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-twgwh" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.457487 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-wgsw2"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.458485 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb4kw\" (UniqueName: \"kubernetes.io/projected/8a1199cd-df49-4078-b2ab-211f902bd097-kube-api-access-qb4kw\") pod \"octavia-operator-controller-manager-998648c74-fh7wp\" (UID: \"8a1199cd-df49-4078-b2ab-211f902bd097\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-fh7wp" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.458574 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgcg5\" (UniqueName: \"kubernetes.io/projected/282ee9d9-b6ee-4dd7-bce4-419428bd744b-kube-api-access-mgcg5\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-zbkjz\" (UID: \"282ee9d9-b6ee-4dd7-bce4-419428bd744b\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zbkjz" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.458627 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q78d8\" (UniqueName: \"kubernetes.io/projected/577bc508-a3f5-4b97-9360-12b1dfee3890-kube-api-access-q78d8\") pod \"nova-operator-controller-manager-697bc559fc-pzh2j\" (UID: \"577bc508-a3f5-4b97-9360-12b1dfee3890\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pzh2j" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.458659 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1265544-727e-4a41-a2e4-c612230cbbc0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l\" (UID: \"c1265544-727e-4a41-a2e4-c612230cbbc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.458682 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gg7f\" (UniqueName: \"kubernetes.io/projected/c1265544-727e-4a41-a2e4-c612230cbbc0-kube-api-access-8gg7f\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l\" (UID: \"c1265544-727e-4a41-a2e4-c612230cbbc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.458700 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdb6q\" (UniqueName: \"kubernetes.io/projected/51f475b0-ff8d-4438-8c91-c41ce79ea8d1-kube-api-access-tdb6q\") pod \"ovn-operator-controller-manager-b6456fdb6-v585g\" (UID: \"51f475b0-ff8d-4438-8c91-c41ce79ea8d1\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-v585g" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.459201 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-wgsw2" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.464504 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-rl5j8" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.467915 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-dq9jz"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.470005 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-dq9jz" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.478468 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-kklmf" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.526234 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2rz9d" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.542465 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb4kw\" (UniqueName: \"kubernetes.io/projected/8a1199cd-df49-4078-b2ab-211f902bd097-kube-api-access-qb4kw\") pod \"octavia-operator-controller-manager-998648c74-fh7wp\" (UID: \"8a1199cd-df49-4078-b2ab-211f902bd097\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-fh7wp" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.549161 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgcg5\" (UniqueName: \"kubernetes.io/projected/282ee9d9-b6ee-4dd7-bce4-419428bd744b-kube-api-access-mgcg5\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-zbkjz\" (UID: \"282ee9d9-b6ee-4dd7-bce4-419428bd744b\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zbkjz" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.560711 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xd67\" (UniqueName: \"kubernetes.io/projected/63d2cbce-2ef0-43e7-83da-8350663cf0c0-kube-api-access-4xd67\") pod \"swift-operator-controller-manager-5f8c65bbfc-wgsw2\" (UID: \"63d2cbce-2ef0-43e7-83da-8350663cf0c0\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-wgsw2" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.560773 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s8rx\" (UniqueName: \"kubernetes.io/projected/d527652d-7e68-4494-9e69-75296fb63932-kube-api-access-4s8rx\") pod \"placement-operator-controller-manager-78f8948974-7qnmh\" (UID: \"d527652d-7e68-4494-9e69-75296fb63932\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-7qnmh" Dec 01 10:45:31 crc 
kubenswrapper[4909]: I1201 10:45:31.560820 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q78d8\" (UniqueName: \"kubernetes.io/projected/577bc508-a3f5-4b97-9360-12b1dfee3890-kube-api-access-q78d8\") pod \"nova-operator-controller-manager-697bc559fc-pzh2j\" (UID: \"577bc508-a3f5-4b97-9360-12b1dfee3890\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pzh2j" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.560848 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1265544-727e-4a41-a2e4-c612230cbbc0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l\" (UID: \"c1265544-727e-4a41-a2e4-c612230cbbc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.560889 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gg7f\" (UniqueName: \"kubernetes.io/projected/c1265544-727e-4a41-a2e4-c612230cbbc0-kube-api-access-8gg7f\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l\" (UID: \"c1265544-727e-4a41-a2e4-c612230cbbc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.560909 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdb6q\" (UniqueName: \"kubernetes.io/projected/51f475b0-ff8d-4438-8c91-c41ce79ea8d1-kube-api-access-tdb6q\") pod \"ovn-operator-controller-manager-b6456fdb6-v585g\" (UID: \"51f475b0-ff8d-4438-8c91-c41ce79ea8d1\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-v585g" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.560932 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qhrcq\" (UniqueName: \"kubernetes.io/projected/3dcfb91f-b0e5-4087-99cb-09cec4cd5f72-kube-api-access-qhrcq\") pod \"telemetry-operator-controller-manager-76cc84c6bb-dq9jz\" (UID: \"3dcfb91f-b0e5-4087-99cb-09cec4cd5f72\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-dq9jz" Dec 01 10:45:31 crc kubenswrapper[4909]: E1201 10:45:31.561203 4909 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 10:45:31 crc kubenswrapper[4909]: E1201 10:45:31.561294 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1265544-727e-4a41-a2e4-c612230cbbc0-cert podName:c1265544-727e-4a41-a2e4-c612230cbbc0 nodeName:}" failed. No retries permitted until 2025-12-01 10:45:32.061266508 +0000 UTC m=+849.295737486 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c1265544-727e-4a41-a2e4-c612230cbbc0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" (UID: "c1265544-727e-4a41-a2e4-c612230cbbc0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.584517 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-7qnmh"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.586407 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zbkjz" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.597308 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-fh7wp" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.663994 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q78d8\" (UniqueName: \"kubernetes.io/projected/577bc508-a3f5-4b97-9360-12b1dfee3890-kube-api-access-q78d8\") pod \"nova-operator-controller-manager-697bc559fc-pzh2j\" (UID: \"577bc508-a3f5-4b97-9360-12b1dfee3890\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pzh2j" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.665812 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdb6q\" (UniqueName: \"kubernetes.io/projected/51f475b0-ff8d-4438-8c91-c41ce79ea8d1-kube-api-access-tdb6q\") pod \"ovn-operator-controller-manager-b6456fdb6-v585g\" (UID: \"51f475b0-ff8d-4438-8c91-c41ce79ea8d1\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-v585g" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.669803 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gg7f\" (UniqueName: \"kubernetes.io/projected/c1265544-727e-4a41-a2e4-c612230cbbc0-kube-api-access-8gg7f\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l\" (UID: \"c1265544-727e-4a41-a2e4-c612230cbbc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.671962 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.680681 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s8rx\" (UniqueName: \"kubernetes.io/projected/d527652d-7e68-4494-9e69-75296fb63932-kube-api-access-4s8rx\") pod 
\"placement-operator-controller-manager-78f8948974-7qnmh\" (UID: \"d527652d-7e68-4494-9e69-75296fb63932\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-7qnmh" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.681250 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/246fed43-c776-4c8c-a6ac-1f4364191266-cert\") pod \"infra-operator-controller-manager-57548d458d-wbqsz\" (UID: \"246fed43-c776-4c8c-a6ac-1f4364191266\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.681438 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhrcq\" (UniqueName: \"kubernetes.io/projected/3dcfb91f-b0e5-4087-99cb-09cec4cd5f72-kube-api-access-qhrcq\") pod \"telemetry-operator-controller-manager-76cc84c6bb-dq9jz\" (UID: \"3dcfb91f-b0e5-4087-99cb-09cec4cd5f72\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-dq9jz" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.681566 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xd67\" (UniqueName: \"kubernetes.io/projected/63d2cbce-2ef0-43e7-83da-8350663cf0c0-kube-api-access-4xd67\") pod \"swift-operator-controller-manager-5f8c65bbfc-wgsw2\" (UID: \"63d2cbce-2ef0-43e7-83da-8350663cf0c0\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-wgsw2" Dec 01 10:45:31 crc kubenswrapper[4909]: E1201 10:45:31.681587 4909 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 10:45:31 crc kubenswrapper[4909]: E1201 10:45:31.681665 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/246fed43-c776-4c8c-a6ac-1f4364191266-cert podName:246fed43-c776-4c8c-a6ac-1f4364191266 nodeName:}" failed. 
No retries permitted until 2025-12-01 10:45:32.681632467 +0000 UTC m=+849.916103365 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/246fed43-c776-4c8c-a6ac-1f4364191266-cert") pod "infra-operator-controller-manager-57548d458d-wbqsz" (UID: "246fed43-c776-4c8c-a6ac-1f4364191266") : secret "infra-operator-webhook-server-cert" not found Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.737161 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s8rx\" (UniqueName: \"kubernetes.io/projected/d527652d-7e68-4494-9e69-75296fb63932-kube-api-access-4s8rx\") pod \"placement-operator-controller-manager-78f8948974-7qnmh\" (UID: \"d527652d-7e68-4494-9e69-75296fb63932\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-7qnmh" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.740070 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-dq9jz"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.760699 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhrcq\" (UniqueName: \"kubernetes.io/projected/3dcfb91f-b0e5-4087-99cb-09cec4cd5f72-kube-api-access-qhrcq\") pod \"telemetry-operator-controller-manager-76cc84c6bb-dq9jz\" (UID: \"3dcfb91f-b0e5-4087-99cb-09cec4cd5f72\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-dq9jz" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.760779 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xd67\" (UniqueName: \"kubernetes.io/projected/63d2cbce-2ef0-43e7-83da-8350663cf0c0-kube-api-access-4xd67\") pod \"swift-operator-controller-manager-5f8c65bbfc-wgsw2\" (UID: \"63d2cbce-2ef0-43e7-83da-8350663cf0c0\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-wgsw2" Dec 01 10:45:31 crc 
kubenswrapper[4909]: I1201 10:45:31.778801 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-wgsw2"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.791078 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-pg7qg"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.798101 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-pg7qg"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.798059 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-pg7qg" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.801002 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-dq9jz" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.802519 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pzh2j" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.803755 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-hfzjl" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.823925 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-v585g" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.833165 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7qnmh" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.841126 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-xbhwr"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.870487 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-xbhwr"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.870686 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbhwr" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.873783 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-r42xl" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.878931 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.880355 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.900725 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.900991 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-xvvzz" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.901122 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.901798 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.933309 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4mbh\" (UniqueName: \"kubernetes.io/projected/8e79701a-e50d-4782-aca7-82ce16bbfa7e-kube-api-access-z4mbh\") pod \"test-operator-controller-manager-5854674fcc-pg7qg\" (UID: \"8e79701a-e50d-4782-aca7-82ce16bbfa7e\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-pg7qg" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.955582 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nmswj"] Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.957592 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nmswj" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.960674 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-7fdg4" Dec 01 10:45:31 crc kubenswrapper[4909]: I1201 10:45:31.965001 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nmswj"] Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.032859 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-wgsw2" Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.035739 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldhxp\" (UniqueName: \"kubernetes.io/projected/b78befb3-d754-4c59-b0fd-d5dcb9e06588-kube-api-access-ldhxp\") pod \"watcher-operator-controller-manager-769dc69bc-xbhwr\" (UID: \"b78befb3-d754-4c59-b0fd-d5dcb9e06588\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbhwr" Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.035803 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4mbh\" (UniqueName: \"kubernetes.io/projected/8e79701a-e50d-4782-aca7-82ce16bbfa7e-kube-api-access-z4mbh\") pod \"test-operator-controller-manager-5854674fcc-pg7qg\" (UID: \"8e79701a-e50d-4782-aca7-82ce16bbfa7e\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-pg7qg" Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.035851 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-metrics-certs\") pod \"openstack-operator-controller-manager-7884db5fb-nljsq\" 
(UID: \"38909617-5f76-49e1-a3ad-0c3917fddb55\") " pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.035956 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92ldb\" (UniqueName: \"kubernetes.io/projected/38909617-5f76-49e1-a3ad-0c3917fddb55-kube-api-access-92ldb\") pod \"openstack-operator-controller-manager-7884db5fb-nljsq\" (UID: \"38909617-5f76-49e1-a3ad-0c3917fddb55\") " pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.036006 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-webhook-certs\") pod \"openstack-operator-controller-manager-7884db5fb-nljsq\" (UID: \"38909617-5f76-49e1-a3ad-0c3917fddb55\") " pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.049557 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-hdfcg"] Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.059208 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-77s9g"] Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.073151 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wjqrn"] Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.075996 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4mbh\" (UniqueName: \"kubernetes.io/projected/8e79701a-e50d-4782-aca7-82ce16bbfa7e-kube-api-access-z4mbh\") pod \"test-operator-controller-manager-5854674fcc-pg7qg\" (UID: 
\"8e79701a-e50d-4782-aca7-82ce16bbfa7e\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-pg7qg" Dec 01 10:45:32 crc kubenswrapper[4909]: W1201 10:45:32.082788 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc6b6fc4_e26d_4401_ba4f_f87c3a5b503c.slice/crio-6fb94724900039ff86e6a6634630bca8a106762e2eeda93f232e77c4453f0cb9 WatchSource:0}: Error finding container 6fb94724900039ff86e6a6634630bca8a106762e2eeda93f232e77c4453f0cb9: Status 404 returned error can't find the container with id 6fb94724900039ff86e6a6634630bca8a106762e2eeda93f232e77c4453f0cb9 Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.137132 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldhxp\" (UniqueName: \"kubernetes.io/projected/b78befb3-d754-4c59-b0fd-d5dcb9e06588-kube-api-access-ldhxp\") pod \"watcher-operator-controller-manager-769dc69bc-xbhwr\" (UID: \"b78befb3-d754-4c59-b0fd-d5dcb9e06588\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbhwr" Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.137215 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-metrics-certs\") pod \"openstack-operator-controller-manager-7884db5fb-nljsq\" (UID: \"38909617-5f76-49e1-a3ad-0c3917fddb55\") " pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.137242 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92ldb\" (UniqueName: \"kubernetes.io/projected/38909617-5f76-49e1-a3ad-0c3917fddb55-kube-api-access-92ldb\") pod \"openstack-operator-controller-manager-7884db5fb-nljsq\" (UID: \"38909617-5f76-49e1-a3ad-0c3917fddb55\") " 
pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.137313 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmcfv\" (UniqueName: \"kubernetes.io/projected/02f59677-4f80-4d7d-8921-08e112fadbc2-kube-api-access-wmcfv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-nmswj\" (UID: \"02f59677-4f80-4d7d-8921-08e112fadbc2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nmswj" Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.137355 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-webhook-certs\") pod \"openstack-operator-controller-manager-7884db5fb-nljsq\" (UID: \"38909617-5f76-49e1-a3ad-0c3917fddb55\") " pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.137387 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1265544-727e-4a41-a2e4-c612230cbbc0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l\" (UID: \"c1265544-727e-4a41-a2e4-c612230cbbc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" Dec 01 10:45:32 crc kubenswrapper[4909]: E1201 10:45:32.137518 4909 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 10:45:32 crc kubenswrapper[4909]: E1201 10:45:32.137562 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1265544-727e-4a41-a2e4-c612230cbbc0-cert podName:c1265544-727e-4a41-a2e4-c612230cbbc0 nodeName:}" failed. 
No retries permitted until 2025-12-01 10:45:33.137549357 +0000 UTC m=+850.372020245 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c1265544-727e-4a41-a2e4-c612230cbbc0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" (UID: "c1265544-727e-4a41-a2e4-c612230cbbc0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 10:45:32 crc kubenswrapper[4909]: E1201 10:45:32.138091 4909 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 10:45:32 crc kubenswrapper[4909]: E1201 10:45:32.138120 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-metrics-certs podName:38909617-5f76-49e1-a3ad-0c3917fddb55 nodeName:}" failed. No retries permitted until 2025-12-01 10:45:32.638111835 +0000 UTC m=+849.872582733 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-metrics-certs") pod "openstack-operator-controller-manager-7884db5fb-nljsq" (UID: "38909617-5f76-49e1-a3ad-0c3917fddb55") : secret "metrics-server-cert" not found Dec 01 10:45:32 crc kubenswrapper[4909]: E1201 10:45:32.138430 4909 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 10:45:32 crc kubenswrapper[4909]: E1201 10:45:32.138457 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-webhook-certs podName:38909617-5f76-49e1-a3ad-0c3917fddb55 nodeName:}" failed. No retries permitted until 2025-12-01 10:45:32.638448364 +0000 UTC m=+849.872919262 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-webhook-certs") pod "openstack-operator-controller-manager-7884db5fb-nljsq" (UID: "38909617-5f76-49e1-a3ad-0c3917fddb55") : secret "webhook-server-cert" not found Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.147334 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-pg7qg" Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.169503 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92ldb\" (UniqueName: \"kubernetes.io/projected/38909617-5f76-49e1-a3ad-0c3917fddb55-kube-api-access-92ldb\") pod \"openstack-operator-controller-manager-7884db5fb-nljsq\" (UID: \"38909617-5f76-49e1-a3ad-0c3917fddb55\") " pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.176391 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldhxp\" (UniqueName: \"kubernetes.io/projected/b78befb3-d754-4c59-b0fd-d5dcb9e06588-kube-api-access-ldhxp\") pod \"watcher-operator-controller-manager-769dc69bc-xbhwr\" (UID: \"b78befb3-d754-4c59-b0fd-d5dcb9e06588\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbhwr" Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.238948 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmcfv\" (UniqueName: \"kubernetes.io/projected/02f59677-4f80-4d7d-8921-08e112fadbc2-kube-api-access-wmcfv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-nmswj\" (UID: \"02f59677-4f80-4d7d-8921-08e112fadbc2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nmswj" Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.276856 4909 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-wmcfv\" (UniqueName: \"kubernetes.io/projected/02f59677-4f80-4d7d-8921-08e112fadbc2-kube-api-access-wmcfv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-nmswj\" (UID: \"02f59677-4f80-4d7d-8921-08e112fadbc2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nmswj" Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.361334 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbhwr" Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.368492 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-9txzx"] Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.386327 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nmswj" Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.411450 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hdfcg" event={"ID":"dc6b6fc4-e26d-4401-ba4f-f87c3a5b503c","Type":"ContainerStarted","Data":"6fb94724900039ff86e6a6634630bca8a106762e2eeda93f232e77c4453f0cb9"} Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.426320 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wjqrn" event={"ID":"1cedb704-1d3a-4c68-b14e-00bc65309891","Type":"ContainerStarted","Data":"382a042e793824e2a039e0b87d8a2b69f4053a294b095a229bf2580ac7737708"} Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.441335 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-77s9g" 
event={"ID":"19f1e6b3-cf7a-45b0-b9c2-8c48bdf6ead1","Type":"ContainerStarted","Data":"21a04a18311b00945250cebde62b71431846673afa9fc5a53edccfb3c799b580"} Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.515554 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2rz9d"] Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.545053 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z98b8"] Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.560081 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-nxbp4"] Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.571276 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8c7f494db-q9n6n"] Dec 01 10:45:32 crc kubenswrapper[4909]: W1201 10:45:32.576558 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod510857ee_52e0_4d2c_8fff_2cc3dabb0dfa.slice/crio-4216b9a70a6b331da9daa70d456ca666eac29ae26867958e7e29343b1afdd762 WatchSource:0}: Error finding container 4216b9a70a6b331da9daa70d456ca666eac29ae26867958e7e29343b1afdd762: Status 404 returned error can't find the container with id 4216b9a70a6b331da9daa70d456ca666eac29ae26867958e7e29343b1afdd762 Dec 01 10:45:32 crc kubenswrapper[4909]: W1201 10:45:32.593959 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cbe8fe9_4f7c_4ba7_b6c3_ff8a074fcc41.slice/crio-6cc6fb9dc645c49e483889f9ca4e6e36144196832e17d294bd6feeaccc3b21be WatchSource:0}: Error finding container 6cc6fb9dc645c49e483889f9ca4e6e36144196832e17d294bd6feeaccc3b21be: Status 404 returned error can't find the container with id 
6cc6fb9dc645c49e483889f9ca4e6e36144196832e17d294bd6feeaccc3b21be Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.620918 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-q7zr4"] Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.648947 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zbkjz"] Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.658475 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-metrics-certs\") pod \"openstack-operator-controller-manager-7884db5fb-nljsq\" (UID: \"38909617-5f76-49e1-a3ad-0c3917fddb55\") " pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.658554 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-webhook-certs\") pod \"openstack-operator-controller-manager-7884db5fb-nljsq\" (UID: \"38909617-5f76-49e1-a3ad-0c3917fddb55\") " pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:45:32 crc kubenswrapper[4909]: E1201 10:45:32.658714 4909 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 10:45:32 crc kubenswrapper[4909]: E1201 10:45:32.658777 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-webhook-certs podName:38909617-5f76-49e1-a3ad-0c3917fddb55 nodeName:}" failed. No retries permitted until 2025-12-01 10:45:33.658756426 +0000 UTC m=+850.893227324 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-webhook-certs") pod "openstack-operator-controller-manager-7884db5fb-nljsq" (UID: "38909617-5f76-49e1-a3ad-0c3917fddb55") : secret "webhook-server-cert" not found Dec 01 10:45:32 crc kubenswrapper[4909]: E1201 10:45:32.659207 4909 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 10:45:32 crc kubenswrapper[4909]: E1201 10:45:32.659238 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-metrics-certs podName:38909617-5f76-49e1-a3ad-0c3917fddb55 nodeName:}" failed. No retries permitted until 2025-12-01 10:45:33.6592303 +0000 UTC m=+850.893701198 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-metrics-certs") pod "openstack-operator-controller-manager-7884db5fb-nljsq" (UID: "38909617-5f76-49e1-a3ad-0c3917fddb55") : secret "metrics-server-cert" not found Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.762577 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/246fed43-c776-4c8c-a6ac-1f4364191266-cert\") pod \"infra-operator-controller-manager-57548d458d-wbqsz\" (UID: \"246fed43-c776-4c8c-a6ac-1f4364191266\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz" Dec 01 10:45:32 crc kubenswrapper[4909]: E1201 10:45:32.762790 4909 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 10:45:32 crc kubenswrapper[4909]: E1201 10:45:32.762854 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/246fed43-c776-4c8c-a6ac-1f4364191266-cert 
podName:246fed43-c776-4c8c-a6ac-1f4364191266 nodeName:}" failed. No retries permitted until 2025-12-01 10:45:34.762834318 +0000 UTC m=+851.997305216 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/246fed43-c776-4c8c-a6ac-1f4364191266-cert") pod "infra-operator-controller-manager-57548d458d-wbqsz" (UID: "246fed43-c776-4c8c-a6ac-1f4364191266") : secret "infra-operator-webhook-server-cert" not found Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.767514 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-twgwh"] Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.774161 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-dq9jz"] Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.793848 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-fh7wp"] Dec 01 10:45:32 crc kubenswrapper[4909]: W1201 10:45:32.794025 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a1199cd_df49_4078_b2ab_211f902bd097.slice/crio-2c97134e3ba6a9f04268e5548f3a4d9019fb01f55552d14a1b61aec48f476b9f WatchSource:0}: Error finding container 2c97134e3ba6a9f04268e5548f3a4d9019fb01f55552d14a1b61aec48f476b9f: Status 404 returned error can't find the container with id 2c97134e3ba6a9f04268e5548f3a4d9019fb01f55552d14a1b61aec48f476b9f Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.923025 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-7qnmh"] Dec 01 10:45:32 crc kubenswrapper[4909]: I1201 10:45:32.934678 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-v585g"] Dec 01 
10:45:32 crc kubenswrapper[4909]: W1201 10:45:32.976981 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51f475b0_ff8d_4438_8c91_c41ce79ea8d1.slice/crio-0e03349891e9829bc230d91cab66de64e0decb1da8582ccba9340868d7d60d31 WatchSource:0}: Error finding container 0e03349891e9829bc230d91cab66de64e0decb1da8582ccba9340868d7d60d31: Status 404 returned error can't find the container with id 0e03349891e9829bc230d91cab66de64e0decb1da8582ccba9340868d7d60d31 Dec 01 10:45:32 crc kubenswrapper[4909]: W1201 10:45:32.979011 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd527652d_7e68_4494_9e69_75296fb63932.slice/crio-83e1d3f93ad83027f4ecd6117dda191f4b0b83639c9592af98b612eede86693f WatchSource:0}: Error finding container 83e1d3f93ad83027f4ecd6117dda191f4b0b83639c9592af98b612eede86693f: Status 404 returned error can't find the container with id 83e1d3f93ad83027f4ecd6117dda191f4b0b83639c9592af98b612eede86693f Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.002544 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4s8rx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-7qnmh_openstack-operators(d527652d-7e68-4494-9e69-75296fb63932): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.010059 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4s8rx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-7qnmh_openstack-operators(d527652d-7e68-4494-9e69-75296fb63932): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.011248 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7qnmh" podUID="d527652d-7e68-4494-9e69-75296fb63932" Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.035742 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-wgsw2"] Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.044468 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-xbhwr"] Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.052372 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-pzh2j"] Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.054062 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wmcfv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-nmswj_openstack-operators(02f59677-4f80-4d7d-8921-08e112fadbc2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.055592 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nmswj" podUID="02f59677-4f80-4d7d-8921-08e112fadbc2" Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.061726 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ldhxp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-xbhwr_openstack-operators(b78befb3-d754-4c59-b0fd-d5dcb9e06588): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.062205 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4xd67,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-wgsw2_openstack-operators(63d2cbce-2ef0-43e7-83da-8350663cf0c0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.062268 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nmswj"] Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.065469 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4xd67,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-wgsw2_openstack-operators(63d2cbce-2ef0-43e7-83da-8350663cf0c0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.065632 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ldhxp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-xbhwr_openstack-operators(b78befb3-d754-4c59-b0fd-d5dcb9e06588): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.066834 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbhwr" podUID="b78befb3-d754-4c59-b0fd-d5dcb9e06588" Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.066906 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with 
ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-wgsw2" podUID="63d2cbce-2ef0-43e7-83da-8350663cf0c0" Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.067648 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-pg7qg"] Dec 01 10:45:33 crc kubenswrapper[4909]: W1201 10:45:33.072734 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod577bc508_a3f5_4b97_9360_12b1dfee3890.slice/crio-bdc0bdbc1de032a4c157b3ab73f16ce6822b02721d4e3e208be3f9e7c1a7e687 WatchSource:0}: Error finding container bdc0bdbc1de032a4c157b3ab73f16ce6822b02721d4e3e208be3f9e7c1a7e687: Status 404 returned error can't find the container with id bdc0bdbc1de032a4c157b3ab73f16ce6822b02721d4e3e208be3f9e7c1a7e687 Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.076711 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q78d8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-pzh2j_openstack-operators(577bc508-a3f5-4b97-9360-12b1dfee3890): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.078852 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q78d8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-pzh2j_openstack-operators(577bc508-a3f5-4b97-9360-12b1dfee3890): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.080119 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pzh2j" podUID="577bc508-a3f5-4b97-9360-12b1dfee3890" Dec 01 10:45:33 crc kubenswrapper[4909]: W1201 10:45:33.087529 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e79701a_e50d_4782_aca7_82ce16bbfa7e.slice/crio-4dd8fdf02dd7f59eb7033e0adcafc6b21b6ebf8e9b65e3c40223205cb12ed184 WatchSource:0}: Error finding container 4dd8fdf02dd7f59eb7033e0adcafc6b21b6ebf8e9b65e3c40223205cb12ed184: Status 404 returned error can't find the container with id 4dd8fdf02dd7f59eb7033e0adcafc6b21b6ebf8e9b65e3c40223205cb12ed184 Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.091296 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z4mbh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-pg7qg_openstack-operators(8e79701a-e50d-4782-aca7-82ce16bbfa7e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.094003 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z4mbh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-pg7qg_openstack-operators(8e79701a-e50d-4782-aca7-82ce16bbfa7e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.095407 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-pg7qg" podUID="8e79701a-e50d-4782-aca7-82ce16bbfa7e" Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.174242 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1265544-727e-4a41-a2e4-c612230cbbc0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l\" (UID: \"c1265544-727e-4a41-a2e4-c612230cbbc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.174429 4909 
secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.174499 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1265544-727e-4a41-a2e4-c612230cbbc0-cert podName:c1265544-727e-4a41-a2e4-c612230cbbc0 nodeName:}" failed. No retries permitted until 2025-12-01 10:45:35.174479203 +0000 UTC m=+852.408950101 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c1265544-727e-4a41-a2e4-c612230cbbc0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" (UID: "c1265544-727e-4a41-a2e4-c612230cbbc0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.480832 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-wgsw2" event={"ID":"63d2cbce-2ef0-43e7-83da-8350663cf0c0","Type":"ContainerStarted","Data":"d85dc14b59e80d7e504df849fefc3c97eb2c5a24dace4ddc9089933f23aefaf7"} Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.485689 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8c7f494db-q9n6n" event={"ID":"9cbe8fe9-4f7c-4ba7-b6c3-ff8a074fcc41","Type":"ContainerStarted","Data":"6cc6fb9dc645c49e483889f9ca4e6e36144196832e17d294bd6feeaccc3b21be"} Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.499894 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-wgsw2" podUID="63d2cbce-2ef0-43e7-83da-8350663cf0c0" Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.503191 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-fh7wp" event={"ID":"8a1199cd-df49-4078-b2ab-211f902bd097","Type":"ContainerStarted","Data":"2c97134e3ba6a9f04268e5548f3a4d9019fb01f55552d14a1b61aec48f476b9f"} Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.522688 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-nxbp4" event={"ID":"949868bd-22ef-467f-b767-2d6dbd02dab1","Type":"ContainerStarted","Data":"4ae712269933a87e114f5008ce5bbb1f414594265f81c0ba49c50b77cae4bacd"} Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.526310 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2rz9d" event={"ID":"510857ee-52e0-4d2c-8fff-2cc3dabb0dfa","Type":"ContainerStarted","Data":"4216b9a70a6b331da9daa70d456ca666eac29ae26867958e7e29343b1afdd762"} Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.531050 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-v585g" event={"ID":"51f475b0-ff8d-4438-8c91-c41ce79ea8d1","Type":"ContainerStarted","Data":"0e03349891e9829bc230d91cab66de64e0decb1da8582ccba9340868d7d60d31"} Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.533751 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z98b8" event={"ID":"67a7fb10-11c3-4fb1-90c7-ea30b122719f","Type":"ContainerStarted","Data":"dbd73204a04dc4bd20da97d32feab4cb31f788b2abdd4393a29fd2d77b143191"} Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.549370 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbhwr" event={"ID":"b78befb3-d754-4c59-b0fd-d5dcb9e06588","Type":"ContainerStarted","Data":"10660eaccdc3c55d7362135a17a224742b49f7c042355dc69fb23926a0acf10b"} Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.553690 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbhwr" podUID="b78befb3-d754-4c59-b0fd-d5dcb9e06588" Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.553725 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7qnmh" event={"ID":"d527652d-7e68-4494-9e69-75296fb63932","Type":"ContainerStarted","Data":"83e1d3f93ad83027f4ecd6117dda191f4b0b83639c9592af98b612eede86693f"} Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.556636 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7qnmh" podUID="d527652d-7e68-4494-9e69-75296fb63932" Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.557640 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-pg7qg" event={"ID":"8e79701a-e50d-4782-aca7-82ce16bbfa7e","Type":"ContainerStarted","Data":"4dd8fdf02dd7f59eb7033e0adcafc6b21b6ebf8e9b65e3c40223205cb12ed184"} Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.566216 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nmswj" event={"ID":"02f59677-4f80-4d7d-8921-08e112fadbc2","Type":"ContainerStarted","Data":"c41d7836788382435909c886f54579782b18c32ba5322cc8ba3f0bdccc47d0a1"} Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.566496 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-pg7qg" podUID="8e79701a-e50d-4782-aca7-82ce16bbfa7e" Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.579469 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nmswj" podUID="02f59677-4f80-4d7d-8921-08e112fadbc2" Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.587436 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pzh2j" 
event={"ID":"577bc508-a3f5-4b97-9360-12b1dfee3890","Type":"ContainerStarted","Data":"bdc0bdbc1de032a4c157b3ab73f16ce6822b02721d4e3e208be3f9e7c1a7e687"} Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.589210 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-9txzx" event={"ID":"cfac3bef-3c49-47ba-94be-d267a57f731a","Type":"ContainerStarted","Data":"b77a0e05315423c09fc724dd03c0429ab5081eb57b2c947a53ca8e14283a663e"} Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.592420 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-q7zr4" event={"ID":"79de64b2-dbd0-4877-9556-8fb199c9786c","Type":"ContainerStarted","Data":"03f1f8945c2240ee082fb83cc5432f50e7882e610010804e233c26326f662ab7"} Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.592745 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pzh2j" podUID="577bc508-a3f5-4b97-9360-12b1dfee3890" Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.599222 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-dq9jz" event={"ID":"3dcfb91f-b0e5-4087-99cb-09cec4cd5f72","Type":"ContainerStarted","Data":"2bf733ac36d10d803e54da41b35ef31e29164124dc0dc8dc3a920cab3226f6dd"} Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.600980 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-controller-manager-6546668bfd-twgwh" event={"ID":"6d2a5c48-0cc8-4b2f-9c17-ee4d82e72e45","Type":"ContainerStarted","Data":"775be1b33b899a3c4c7b695fc0bb5a76d5aecebeb9c4d79d46112d26b4b47dc2"} Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.604396 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zbkjz" event={"ID":"282ee9d9-b6ee-4dd7-bce4-419428bd744b","Type":"ContainerStarted","Data":"358e17ce8b70e4a43bf367ee92ae86c8c73633622a4d8dc48efd3bd371433887"} Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.689838 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-metrics-certs\") pod \"openstack-operator-controller-manager-7884db5fb-nljsq\" (UID: \"38909617-5f76-49e1-a3ad-0c3917fddb55\") " pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:45:33 crc kubenswrapper[4909]: I1201 10:45:33.689973 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-webhook-certs\") pod \"openstack-operator-controller-manager-7884db5fb-nljsq\" (UID: \"38909617-5f76-49e1-a3ad-0c3917fddb55\") " pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.690545 4909 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.690610 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-metrics-certs podName:38909617-5f76-49e1-a3ad-0c3917fddb55 nodeName:}" failed. 
No retries permitted until 2025-12-01 10:45:35.690589813 +0000 UTC m=+852.925060711 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-metrics-certs") pod "openstack-operator-controller-manager-7884db5fb-nljsq" (UID: "38909617-5f76-49e1-a3ad-0c3917fddb55") : secret "metrics-server-cert" not found Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.691279 4909 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 10:45:33 crc kubenswrapper[4909]: E1201 10:45:33.691366 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-webhook-certs podName:38909617-5f76-49e1-a3ad-0c3917fddb55 nodeName:}" failed. No retries permitted until 2025-12-01 10:45:35.691345997 +0000 UTC m=+852.925816895 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-webhook-certs") pod "openstack-operator-controller-manager-7884db5fb-nljsq" (UID: "38909617-5f76-49e1-a3ad-0c3917fddb55") : secret "webhook-server-cert" not found Dec 01 10:45:34 crc kubenswrapper[4909]: E1201 10:45:34.625554 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nmswj" podUID="02f59677-4f80-4d7d-8921-08e112fadbc2" Dec 01 10:45:34 crc kubenswrapper[4909]: E1201 10:45:34.625753 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-pg7qg" podUID="8e79701a-e50d-4782-aca7-82ce16bbfa7e" Dec 01 10:45:34 crc kubenswrapper[4909]: E1201 10:45:34.627677 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-wgsw2" podUID="63d2cbce-2ef0-43e7-83da-8350663cf0c0" Dec 01 10:45:34 crc kubenswrapper[4909]: E1201 10:45:34.628355 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pzh2j" podUID="577bc508-a3f5-4b97-9360-12b1dfee3890" Dec 01 10:45:34 crc kubenswrapper[4909]: E1201 10:45:34.628465 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7qnmh" podUID="d527652d-7e68-4494-9e69-75296fb63932" Dec 01 10:45:34 crc kubenswrapper[4909]: E1201 10:45:34.628587 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbhwr" podUID="b78befb3-d754-4c59-b0fd-d5dcb9e06588" Dec 01 10:45:34 crc kubenswrapper[4909]: I1201 10:45:34.820771 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/246fed43-c776-4c8c-a6ac-1f4364191266-cert\") pod \"infra-operator-controller-manager-57548d458d-wbqsz\" (UID: \"246fed43-c776-4c8c-a6ac-1f4364191266\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz" Dec 01 10:45:34 crc kubenswrapper[4909]: E1201 10:45:34.821246 4909 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 10:45:34 crc kubenswrapper[4909]: E1201 10:45:34.821467 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/246fed43-c776-4c8c-a6ac-1f4364191266-cert podName:246fed43-c776-4c8c-a6ac-1f4364191266 nodeName:}" failed. 
No retries permitted until 2025-12-01 10:45:38.821416956 +0000 UTC m=+856.055887844 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/246fed43-c776-4c8c-a6ac-1f4364191266-cert") pod "infra-operator-controller-manager-57548d458d-wbqsz" (UID: "246fed43-c776-4c8c-a6ac-1f4364191266") : secret "infra-operator-webhook-server-cert" not found Dec 01 10:45:35 crc kubenswrapper[4909]: I1201 10:45:35.228783 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1265544-727e-4a41-a2e4-c612230cbbc0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l\" (UID: \"c1265544-727e-4a41-a2e4-c612230cbbc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" Dec 01 10:45:35 crc kubenswrapper[4909]: E1201 10:45:35.229230 4909 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 10:45:35 crc kubenswrapper[4909]: E1201 10:45:35.229349 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1265544-727e-4a41-a2e4-c612230cbbc0-cert podName:c1265544-727e-4a41-a2e4-c612230cbbc0 nodeName:}" failed. No retries permitted until 2025-12-01 10:45:39.229328636 +0000 UTC m=+856.463799534 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c1265544-727e-4a41-a2e4-c612230cbbc0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" (UID: "c1265544-727e-4a41-a2e4-c612230cbbc0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 10:45:35 crc kubenswrapper[4909]: I1201 10:45:35.736920 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-metrics-certs\") pod \"openstack-operator-controller-manager-7884db5fb-nljsq\" (UID: \"38909617-5f76-49e1-a3ad-0c3917fddb55\") " pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:45:35 crc kubenswrapper[4909]: I1201 10:45:35.737013 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-webhook-certs\") pod \"openstack-operator-controller-manager-7884db5fb-nljsq\" (UID: \"38909617-5f76-49e1-a3ad-0c3917fddb55\") " pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:45:35 crc kubenswrapper[4909]: E1201 10:45:35.737230 4909 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 10:45:35 crc kubenswrapper[4909]: E1201 10:45:35.737332 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-webhook-certs podName:38909617-5f76-49e1-a3ad-0c3917fddb55 nodeName:}" failed. No retries permitted until 2025-12-01 10:45:39.737301563 +0000 UTC m=+856.971772461 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-webhook-certs") pod "openstack-operator-controller-manager-7884db5fb-nljsq" (UID: "38909617-5f76-49e1-a3ad-0c3917fddb55") : secret "webhook-server-cert" not found Dec 01 10:45:35 crc kubenswrapper[4909]: E1201 10:45:35.737836 4909 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 10:45:35 crc kubenswrapper[4909]: E1201 10:45:35.737889 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-metrics-certs podName:38909617-5f76-49e1-a3ad-0c3917fddb55 nodeName:}" failed. No retries permitted until 2025-12-01 10:45:39.737864091 +0000 UTC m=+856.972334989 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-metrics-certs") pod "openstack-operator-controller-manager-7884db5fb-nljsq" (UID: "38909617-5f76-49e1-a3ad-0c3917fddb55") : secret "metrics-server-cert" not found Dec 01 10:45:38 crc kubenswrapper[4909]: I1201 10:45:38.830602 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/246fed43-c776-4c8c-a6ac-1f4364191266-cert\") pod \"infra-operator-controller-manager-57548d458d-wbqsz\" (UID: \"246fed43-c776-4c8c-a6ac-1f4364191266\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz" Dec 01 10:45:38 crc kubenswrapper[4909]: E1201 10:45:38.830837 4909 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 10:45:38 crc kubenswrapper[4909]: E1201 10:45:38.831207 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/246fed43-c776-4c8c-a6ac-1f4364191266-cert 
podName:246fed43-c776-4c8c-a6ac-1f4364191266 nodeName:}" failed. No retries permitted until 2025-12-01 10:45:46.831177397 +0000 UTC m=+864.065648295 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/246fed43-c776-4c8c-a6ac-1f4364191266-cert") pod "infra-operator-controller-manager-57548d458d-wbqsz" (UID: "246fed43-c776-4c8c-a6ac-1f4364191266") : secret "infra-operator-webhook-server-cert" not found Dec 01 10:45:39 crc kubenswrapper[4909]: I1201 10:45:39.235345 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1265544-727e-4a41-a2e4-c612230cbbc0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l\" (UID: \"c1265544-727e-4a41-a2e4-c612230cbbc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" Dec 01 10:45:39 crc kubenswrapper[4909]: E1201 10:45:39.235516 4909 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 10:45:39 crc kubenswrapper[4909]: E1201 10:45:39.235691 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1265544-727e-4a41-a2e4-c612230cbbc0-cert podName:c1265544-727e-4a41-a2e4-c612230cbbc0 nodeName:}" failed. No retries permitted until 2025-12-01 10:45:47.23567479 +0000 UTC m=+864.470145688 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c1265544-727e-4a41-a2e4-c612230cbbc0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" (UID: "c1265544-727e-4a41-a2e4-c612230cbbc0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 10:45:39 crc kubenswrapper[4909]: I1201 10:45:39.745772 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-metrics-certs\") pod \"openstack-operator-controller-manager-7884db5fb-nljsq\" (UID: \"38909617-5f76-49e1-a3ad-0c3917fddb55\") " pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:45:39 crc kubenswrapper[4909]: I1201 10:45:39.745968 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-webhook-certs\") pod \"openstack-operator-controller-manager-7884db5fb-nljsq\" (UID: \"38909617-5f76-49e1-a3ad-0c3917fddb55\") " pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:45:39 crc kubenswrapper[4909]: E1201 10:45:39.745978 4909 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 10:45:39 crc kubenswrapper[4909]: E1201 10:45:39.746066 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-metrics-certs podName:38909617-5f76-49e1-a3ad-0c3917fddb55 nodeName:}" failed. No retries permitted until 2025-12-01 10:45:47.746042052 +0000 UTC m=+864.980512950 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-metrics-certs") pod "openstack-operator-controller-manager-7884db5fb-nljsq" (UID: "38909617-5f76-49e1-a3ad-0c3917fddb55") : secret "metrics-server-cert" not found Dec 01 10:45:39 crc kubenswrapper[4909]: E1201 10:45:39.746247 4909 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 10:45:39 crc kubenswrapper[4909]: E1201 10:45:39.746329 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-webhook-certs podName:38909617-5f76-49e1-a3ad-0c3917fddb55 nodeName:}" failed. No retries permitted until 2025-12-01 10:45:47.746302791 +0000 UTC m=+864.980773879 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-webhook-certs") pod "openstack-operator-controller-manager-7884db5fb-nljsq" (UID: "38909617-5f76-49e1-a3ad-0c3917fddb55") : secret "webhook-server-cert" not found Dec 01 10:45:46 crc kubenswrapper[4909]: E1201 10:45:46.191772 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530" Dec 01 10:45:46 crc kubenswrapper[4909]: E1201 10:45:46.192741 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-74qpf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-nxbp4_openstack-operators(949868bd-22ef-467f-b767-2d6dbd02dab1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:45:46 crc kubenswrapper[4909]: I1201 10:45:46.869307 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/246fed43-c776-4c8c-a6ac-1f4364191266-cert\") pod \"infra-operator-controller-manager-57548d458d-wbqsz\" (UID: \"246fed43-c776-4c8c-a6ac-1f4364191266\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz" Dec 01 10:45:46 crc kubenswrapper[4909]: I1201 10:45:46.877116 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/246fed43-c776-4c8c-a6ac-1f4364191266-cert\") pod \"infra-operator-controller-manager-57548d458d-wbqsz\" (UID: \"246fed43-c776-4c8c-a6ac-1f4364191266\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz" Dec 01 10:45:46 crc kubenswrapper[4909]: E1201 10:45:46.951719 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:ecf7be921850bdc04697ed1b332bab39ad2a64e4e45c2a445c04f9bae6ac61b5" Dec 01 10:45:46 crc kubenswrapper[4909]: E1201 10:45:46.952431 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:ecf7be921850bdc04697ed1b332bab39ad2a64e4e45c2a445c04f9bae6ac61b5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7cn9z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-6546668bfd-twgwh_openstack-operators(6d2a5c48-0cc8-4b2f-9c17-ee4d82e72e45): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:45:47 crc kubenswrapper[4909]: I1201 10:45:47.113650 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz" Dec 01 10:45:47 crc kubenswrapper[4909]: I1201 10:45:47.282236 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1265544-727e-4a41-a2e4-c612230cbbc0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l\" (UID: \"c1265544-727e-4a41-a2e4-c612230cbbc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" Dec 01 10:45:47 crc kubenswrapper[4909]: E1201 10:45:47.282468 4909 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 10:45:47 crc kubenswrapper[4909]: E1201 10:45:47.282921 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1265544-727e-4a41-a2e4-c612230cbbc0-cert podName:c1265544-727e-4a41-a2e4-c612230cbbc0 nodeName:}" failed. No retries permitted until 2025-12-01 10:46:03.282894223 +0000 UTC m=+880.517365121 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c1265544-727e-4a41-a2e4-c612230cbbc0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" (UID: "c1265544-727e-4a41-a2e4-c612230cbbc0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 10:45:47 crc kubenswrapper[4909]: I1201 10:45:47.791421 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-metrics-certs\") pod \"openstack-operator-controller-manager-7884db5fb-nljsq\" (UID: \"38909617-5f76-49e1-a3ad-0c3917fddb55\") " pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:45:47 crc kubenswrapper[4909]: I1201 10:45:47.791548 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-webhook-certs\") pod \"openstack-operator-controller-manager-7884db5fb-nljsq\" (UID: \"38909617-5f76-49e1-a3ad-0c3917fddb55\") " pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:45:47 crc kubenswrapper[4909]: E1201 10:45:47.791895 4909 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 10:45:47 crc kubenswrapper[4909]: E1201 10:45:47.792024 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-webhook-certs podName:38909617-5f76-49e1-a3ad-0c3917fddb55 nodeName:}" failed. No retries permitted until 2025-12-01 10:46:03.791996935 +0000 UTC m=+881.026467833 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-webhook-certs") pod "openstack-operator-controller-manager-7884db5fb-nljsq" (UID: "38909617-5f76-49e1-a3ad-0c3917fddb55") : secret "webhook-server-cert" not found Dec 01 10:45:47 crc kubenswrapper[4909]: E1201 10:45:47.792166 4909 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 10:45:47 crc kubenswrapper[4909]: E1201 10:45:47.792281 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-metrics-certs podName:38909617-5f76-49e1-a3ad-0c3917fddb55 nodeName:}" failed. No retries permitted until 2025-12-01 10:46:03.792252784 +0000 UTC m=+881.026723682 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-metrics-certs") pod "openstack-operator-controller-manager-7884db5fb-nljsq" (UID: "38909617-5f76-49e1-a3ad-0c3917fddb55") : secret "metrics-server-cert" not found Dec 01 10:45:47 crc kubenswrapper[4909]: E1201 10:45:47.834902 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385" Dec 01 10:45:47 crc kubenswrapper[4909]: E1201 10:45:47.835168 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qhrcq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-dq9jz_openstack-operators(3dcfb91f-b0e5-4087-99cb-09cec4cd5f72): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:45:48 crc kubenswrapper[4909]: E1201 10:45:48.978582 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801" Dec 01 10:45:48 crc kubenswrapper[4909]: E1201 10:45:48.978814 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vpcrj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-9txzx_openstack-operators(cfac3bef-3c49-47ba-94be-d267a57f731a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:45:55 crc kubenswrapper[4909]: E1201 10:45:55.290710 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 01 10:45:55 crc kubenswrapper[4909]: E1201 10:45:55.291921 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bb62h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-z98b8_openstack-operators(67a7fb10-11c3-4fb1-90c7-ea30b122719f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:45:55 crc kubenswrapper[4909]: E1201 10:45:55.994266 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3" Dec 01 10:45:55 crc kubenswrapper[4909]: E1201 10:45:55.994496 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tdrtm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-546d4bdf48-q7zr4_openstack-operators(79de64b2-dbd0-4877-9556-8fb199c9786c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:45:57 crc kubenswrapper[4909]: I1201 10:45:57.091406 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zvrd7"] Dec 01 10:45:57 crc kubenswrapper[4909]: I1201 10:45:57.094451 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zvrd7" Dec 01 10:45:57 crc kubenswrapper[4909]: I1201 10:45:57.101155 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zvrd7"] Dec 01 10:45:57 crc kubenswrapper[4909]: I1201 10:45:57.256556 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpqr8\" (UniqueName: \"kubernetes.io/projected/3b8ef6df-1ab3-4c5c-a232-730700d94d24-kube-api-access-xpqr8\") pod \"redhat-marketplace-zvrd7\" (UID: \"3b8ef6df-1ab3-4c5c-a232-730700d94d24\") " pod="openshift-marketplace/redhat-marketplace-zvrd7" Dec 01 10:45:57 crc kubenswrapper[4909]: I1201 10:45:57.256625 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b8ef6df-1ab3-4c5c-a232-730700d94d24-catalog-content\") pod \"redhat-marketplace-zvrd7\" (UID: \"3b8ef6df-1ab3-4c5c-a232-730700d94d24\") " pod="openshift-marketplace/redhat-marketplace-zvrd7" Dec 01 10:45:57 crc kubenswrapper[4909]: I1201 10:45:57.256701 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b8ef6df-1ab3-4c5c-a232-730700d94d24-utilities\") pod \"redhat-marketplace-zvrd7\" (UID: \"3b8ef6df-1ab3-4c5c-a232-730700d94d24\") " pod="openshift-marketplace/redhat-marketplace-zvrd7" Dec 01 10:45:57 crc kubenswrapper[4909]: I1201 10:45:57.357753 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpqr8\" (UniqueName: \"kubernetes.io/projected/3b8ef6df-1ab3-4c5c-a232-730700d94d24-kube-api-access-xpqr8\") pod \"redhat-marketplace-zvrd7\" (UID: \"3b8ef6df-1ab3-4c5c-a232-730700d94d24\") " pod="openshift-marketplace/redhat-marketplace-zvrd7" Dec 01 10:45:57 crc kubenswrapper[4909]: I1201 10:45:57.357842 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b8ef6df-1ab3-4c5c-a232-730700d94d24-catalog-content\") pod \"redhat-marketplace-zvrd7\" (UID: \"3b8ef6df-1ab3-4c5c-a232-730700d94d24\") " pod="openshift-marketplace/redhat-marketplace-zvrd7" Dec 01 10:45:57 crc kubenswrapper[4909]: I1201 10:45:57.357934 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b8ef6df-1ab3-4c5c-a232-730700d94d24-utilities\") pod \"redhat-marketplace-zvrd7\" (UID: \"3b8ef6df-1ab3-4c5c-a232-730700d94d24\") " pod="openshift-marketplace/redhat-marketplace-zvrd7" Dec 01 10:45:57 crc kubenswrapper[4909]: I1201 10:45:57.358779 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b8ef6df-1ab3-4c5c-a232-730700d94d24-catalog-content\") pod \"redhat-marketplace-zvrd7\" (UID: \"3b8ef6df-1ab3-4c5c-a232-730700d94d24\") " pod="openshift-marketplace/redhat-marketplace-zvrd7" Dec 01 10:45:57 crc kubenswrapper[4909]: I1201 10:45:57.359111 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b8ef6df-1ab3-4c5c-a232-730700d94d24-utilities\") pod \"redhat-marketplace-zvrd7\" (UID: \"3b8ef6df-1ab3-4c5c-a232-730700d94d24\") " pod="openshift-marketplace/redhat-marketplace-zvrd7" Dec 01 10:45:57 crc kubenswrapper[4909]: I1201 10:45:57.389092 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpqr8\" (UniqueName: \"kubernetes.io/projected/3b8ef6df-1ab3-4c5c-a232-730700d94d24-kube-api-access-xpqr8\") pod \"redhat-marketplace-zvrd7\" (UID: \"3b8ef6df-1ab3-4c5c-a232-730700d94d24\") " pod="openshift-marketplace/redhat-marketplace-zvrd7" Dec 01 10:45:57 crc kubenswrapper[4909]: I1201 10:45:57.425140 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zvrd7" Dec 01 10:46:00 crc kubenswrapper[4909]: I1201 10:46:00.026426 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q62kw"] Dec 01 10:46:00 crc kubenswrapper[4909]: I1201 10:46:00.058589 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q62kw" Dec 01 10:46:00 crc kubenswrapper[4909]: I1201 10:46:00.068798 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q62kw"] Dec 01 10:46:00 crc kubenswrapper[4909]: I1201 10:46:00.209317 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60e96450-bac5-45cf-94b2-01ad0d4a267e-utilities\") pod \"community-operators-q62kw\" (UID: \"60e96450-bac5-45cf-94b2-01ad0d4a267e\") " pod="openshift-marketplace/community-operators-q62kw" Dec 01 10:46:00 crc kubenswrapper[4909]: I1201 10:46:00.209432 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2vfl\" (UniqueName: \"kubernetes.io/projected/60e96450-bac5-45cf-94b2-01ad0d4a267e-kube-api-access-p2vfl\") pod \"community-operators-q62kw\" (UID: \"60e96450-bac5-45cf-94b2-01ad0d4a267e\") " pod="openshift-marketplace/community-operators-q62kw" Dec 01 10:46:00 crc kubenswrapper[4909]: I1201 10:46:00.209546 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60e96450-bac5-45cf-94b2-01ad0d4a267e-catalog-content\") pod \"community-operators-q62kw\" (UID: \"60e96450-bac5-45cf-94b2-01ad0d4a267e\") " pod="openshift-marketplace/community-operators-q62kw" Dec 01 10:46:00 crc kubenswrapper[4909]: I1201 10:46:00.310766 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60e96450-bac5-45cf-94b2-01ad0d4a267e-catalog-content\") pod \"community-operators-q62kw\" (UID: \"60e96450-bac5-45cf-94b2-01ad0d4a267e\") " pod="openshift-marketplace/community-operators-q62kw" Dec 01 10:46:00 crc kubenswrapper[4909]: I1201 10:46:00.310835 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60e96450-bac5-45cf-94b2-01ad0d4a267e-utilities\") pod \"community-operators-q62kw\" (UID: \"60e96450-bac5-45cf-94b2-01ad0d4a267e\") " pod="openshift-marketplace/community-operators-q62kw" Dec 01 10:46:00 crc kubenswrapper[4909]: I1201 10:46:00.310898 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2vfl\" (UniqueName: \"kubernetes.io/projected/60e96450-bac5-45cf-94b2-01ad0d4a267e-kube-api-access-p2vfl\") pod \"community-operators-q62kw\" (UID: \"60e96450-bac5-45cf-94b2-01ad0d4a267e\") " pod="openshift-marketplace/community-operators-q62kw" Dec 01 10:46:00 crc kubenswrapper[4909]: I1201 10:46:00.311402 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60e96450-bac5-45cf-94b2-01ad0d4a267e-catalog-content\") pod \"community-operators-q62kw\" (UID: \"60e96450-bac5-45cf-94b2-01ad0d4a267e\") " pod="openshift-marketplace/community-operators-q62kw" Dec 01 10:46:00 crc kubenswrapper[4909]: I1201 10:46:00.311588 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60e96450-bac5-45cf-94b2-01ad0d4a267e-utilities\") pod \"community-operators-q62kw\" (UID: \"60e96450-bac5-45cf-94b2-01ad0d4a267e\") " pod="openshift-marketplace/community-operators-q62kw" Dec 01 10:46:00 crc kubenswrapper[4909]: I1201 10:46:00.334252 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2vfl\" (UniqueName: 
\"kubernetes.io/projected/60e96450-bac5-45cf-94b2-01ad0d4a267e-kube-api-access-p2vfl\") pod \"community-operators-q62kw\" (UID: \"60e96450-bac5-45cf-94b2-01ad0d4a267e\") " pod="openshift-marketplace/community-operators-q62kw" Dec 01 10:46:00 crc kubenswrapper[4909]: I1201 10:46:00.388452 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q62kw" Dec 01 10:46:01 crc kubenswrapper[4909]: I1201 10:46:01.603212 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz"] Dec 01 10:46:02 crc kubenswrapper[4909]: I1201 10:46:02.719008 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q62kw"] Dec 01 10:46:02 crc kubenswrapper[4909]: W1201 10:46:02.816046 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60e96450_bac5_45cf_94b2_01ad0d4a267e.slice/crio-88a3898068de95cf9985cb17e5dff4700b37308afcdc67bb871e3bc0b991205a WatchSource:0}: Error finding container 88a3898068de95cf9985cb17e5dff4700b37308afcdc67bb871e3bc0b991205a: Status 404 returned error can't find the container with id 88a3898068de95cf9985cb17e5dff4700b37308afcdc67bb871e3bc0b991205a Dec 01 10:46:02 crc kubenswrapper[4909]: I1201 10:46:02.850323 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q62kw" event={"ID":"60e96450-bac5-45cf-94b2-01ad0d4a267e","Type":"ContainerStarted","Data":"88a3898068de95cf9985cb17e5dff4700b37308afcdc67bb871e3bc0b991205a"} Dec 01 10:46:02 crc kubenswrapper[4909]: I1201 10:46:02.854449 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2rz9d" event={"ID":"510857ee-52e0-4d2c-8fff-2cc3dabb0dfa","Type":"ContainerStarted","Data":"8600690d18c65ba2904103bc0ae9bdad4266fca2c8c4e1ac02c7ac786233e91a"} 
Dec 01 10:46:02 crc kubenswrapper[4909]: I1201 10:46:02.859618 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz" event={"ID":"246fed43-c776-4c8c-a6ac-1f4364191266","Type":"ContainerStarted","Data":"3c6e1bd1ee88a18344f2271e7303f99266d8bc7aabd824ac1926da1b1ade3a92"} Dec 01 10:46:02 crc kubenswrapper[4909]: I1201 10:46:02.979277 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zvrd7"] Dec 01 10:46:03 crc kubenswrapper[4909]: W1201 10:46:03.096688 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b8ef6df_1ab3_4c5c_a232_730700d94d24.slice/crio-6652a3c35806ad841089498acd7beadeda33dadb32bd7e4422cd072b7c023e63 WatchSource:0}: Error finding container 6652a3c35806ad841089498acd7beadeda33dadb32bd7e4422cd072b7c023e63: Status 404 returned error can't find the container with id 6652a3c35806ad841089498acd7beadeda33dadb32bd7e4422cd072b7c023e63 Dec 01 10:46:03 crc kubenswrapper[4909]: I1201 10:46:03.361455 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1265544-727e-4a41-a2e4-c612230cbbc0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l\" (UID: \"c1265544-727e-4a41-a2e4-c612230cbbc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" Dec 01 10:46:03 crc kubenswrapper[4909]: I1201 10:46:03.401267 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1265544-727e-4a41-a2e4-c612230cbbc0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l\" (UID: \"c1265544-727e-4a41-a2e4-c612230cbbc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" Dec 01 10:46:03 crc kubenswrapper[4909]: I1201 
10:46:03.527305 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" Dec 01 10:46:03 crc kubenswrapper[4909]: I1201 10:46:03.874687 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-metrics-certs\") pod \"openstack-operator-controller-manager-7884db5fb-nljsq\" (UID: \"38909617-5f76-49e1-a3ad-0c3917fddb55\") " pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:46:03 crc kubenswrapper[4909]: I1201 10:46:03.874773 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-webhook-certs\") pod \"openstack-operator-controller-manager-7884db5fb-nljsq\" (UID: \"38909617-5f76-49e1-a3ad-0c3917fddb55\") " pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:46:03 crc kubenswrapper[4909]: I1201 10:46:03.883290 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-metrics-certs\") pod \"openstack-operator-controller-manager-7884db5fb-nljsq\" (UID: \"38909617-5f76-49e1-a3ad-0c3917fddb55\") " pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:46:03 crc kubenswrapper[4909]: I1201 10:46:03.883325 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38909617-5f76-49e1-a3ad-0c3917fddb55-webhook-certs\") pod \"openstack-operator-controller-manager-7884db5fb-nljsq\" (UID: \"38909617-5f76-49e1-a3ad-0c3917fddb55\") " pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:46:03 crc kubenswrapper[4909]: I1201 
10:46:03.887002 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8c7f494db-q9n6n" event={"ID":"9cbe8fe9-4f7c-4ba7-b6c3-ff8a074fcc41","Type":"ContainerStarted","Data":"11140410d17d8cfead9321c70a18ae695ea38fad9891f52094b5efc2c4f74b40"} Dec 01 10:46:03 crc kubenswrapper[4909]: I1201 10:46:03.899244 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wjqrn" event={"ID":"1cedb704-1d3a-4c68-b14e-00bc65309891","Type":"ContainerStarted","Data":"65f0a5912ae17fd975b076d663760655bc6295ba68cade780684b3eda398bbba"} Dec 01 10:46:03 crc kubenswrapper[4909]: I1201 10:46:03.901263 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-fh7wp" event={"ID":"8a1199cd-df49-4078-b2ab-211f902bd097","Type":"ContainerStarted","Data":"5dd3c3afd7d7979158b786cff4862d262a2c8ca7c5cf31a1ae180a19c3caa128"} Dec 01 10:46:03 crc kubenswrapper[4909]: I1201 10:46:03.903379 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-77s9g" event={"ID":"19f1e6b3-cf7a-45b0-b9c2-8c48bdf6ead1","Type":"ContainerStarted","Data":"3198c235e35792763bee517912f5813bce407c139df38af1aedbc449444acd15"} Dec 01 10:46:03 crc kubenswrapper[4909]: I1201 10:46:03.904754 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zvrd7" event={"ID":"3b8ef6df-1ab3-4c5c-a232-730700d94d24","Type":"ContainerStarted","Data":"6652a3c35806ad841089498acd7beadeda33dadb32bd7e4422cd072b7c023e63"} Dec 01 10:46:03 crc kubenswrapper[4909]: I1201 10:46:03.910472 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zbkjz" 
event={"ID":"282ee9d9-b6ee-4dd7-bce4-419428bd744b","Type":"ContainerStarted","Data":"224c18aef73ec70b5a657975b250d2e37edf392a26445d0cb382245cc8b4cad0"} Dec 01 10:46:03 crc kubenswrapper[4909]: I1201 10:46:03.913486 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hdfcg" event={"ID":"dc6b6fc4-e26d-4401-ba4f-f87c3a5b503c","Type":"ContainerStarted","Data":"21ca88a15600084cc329eeccc4958bfe4618851804697275f3aa564bb262f530"} Dec 01 10:46:03 crc kubenswrapper[4909]: I1201 10:46:03.915106 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-v585g" event={"ID":"51f475b0-ff8d-4438-8c91-c41ce79ea8d1","Type":"ContainerStarted","Data":"6a8cafcb7dab9f38380fdad2057dbe1f26228902548e90ce46cdcb0ab5ad08c9"} Dec 01 10:46:04 crc kubenswrapper[4909]: I1201 10:46:04.159514 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:46:04 crc kubenswrapper[4909]: E1201 10:46:04.773638 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = parsing image configuration: Get \"https://cdn01.quay.io/quayio-production-s3/sha256/ca/ca79f5a8ee31679eabfc056fe1d23792246706ca8994e1f5752a91e6369bb6bc?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251201%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251201T104603Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=89a2e8fcb11f6ddbfdafd418ed242d490efd4a5e05c4322c16b90125093986d7®ion=us-east-1&namespace=openstack-k8s-operators&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=kube-rbac-proxy&akamai_signature=exp=1764586863~hmac=fd98e432688981a9090996a01c9a231bd09332a1ffec0f618a9ad66a4f2726ab\": remote error: tls: internal error" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 01 
10:46:04 crc kubenswrapper[4909]: E1201 10:46:04.773826 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q6tgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-8c7f494db-q9n6n_openstack-operators(9cbe8fe9-4f7c-4ba7-b6c3-ff8a074fcc41): ErrImagePull: parsing image configuration: Get 
\"https://cdn01.quay.io/quayio-production-s3/sha256/ca/ca79f5a8ee31679eabfc056fe1d23792246706ca8994e1f5752a91e6369bb6bc?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251201%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251201T104603Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=89a2e8fcb11f6ddbfdafd418ed242d490efd4a5e05c4322c16b90125093986d7®ion=us-east-1&namespace=openstack-k8s-operators&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=kube-rbac-proxy&akamai_signature=exp=1764586863~hmac=fd98e432688981a9090996a01c9a231bd09332a1ffec0f618a9ad66a4f2726ab\": remote error: tls: internal error" logger="UnhandledError" Dec 01 10:46:04 crc kubenswrapper[4909]: E1201 10:46:04.776409 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"parsing image configuration: Get \\\"https://cdn01.quay.io/quayio-production-s3/sha256/ca/ca79f5a8ee31679eabfc056fe1d23792246706ca8994e1f5752a91e6369bb6bc?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251201%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251201T104603Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=89a2e8fcb11f6ddbfdafd418ed242d490efd4a5e05c4322c16b90125093986d7®ion=us-east-1&namespace=openstack-k8s-operators&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=kube-rbac-proxy&akamai_signature=exp=1764586863~hmac=fd98e432688981a9090996a01c9a231bd09332a1ffec0f618a9ad66a4f2726ab\\\": remote error: tls: internal error\"" pod="openstack-operators/glance-operator-controller-manager-8c7f494db-q9n6n" podUID="9cbe8fe9-4f7c-4ba7-b6c3-ff8a074fcc41" Dec 01 10:46:04 crc kubenswrapper[4909]: I1201 10:46:04.943703 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7qnmh" 
event={"ID":"d527652d-7e68-4494-9e69-75296fb63932","Type":"ContainerStarted","Data":"4e7d2bc9bf820a62f4949d381b0e4e9f09b79ef5719f4126c44b6e5c3caefbe7"} Dec 01 10:46:04 crc kubenswrapper[4909]: I1201 10:46:04.944324 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8c7f494db-q9n6n" Dec 01 10:46:04 crc kubenswrapper[4909]: E1201 10:46:04.945697 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-8c7f494db-q9n6n" podUID="9cbe8fe9-4f7c-4ba7-b6c3-ff8a074fcc41" Dec 01 10:46:05 crc kubenswrapper[4909]: I1201 10:46:05.952453 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pzh2j" event={"ID":"577bc508-a3f5-4b97-9360-12b1dfee3890","Type":"ContainerStarted","Data":"7a99d18712445c0a245267ef95964d90a973d9e847f5c430384056404e24eba0"} Dec 01 10:46:05 crc kubenswrapper[4909]: E1201 10:46:05.954275 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-8c7f494db-q9n6n" podUID="9cbe8fe9-4f7c-4ba7-b6c3-ff8a074fcc41" Dec 01 10:46:06 crc kubenswrapper[4909]: I1201 10:46:06.194210 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:46:06 crc kubenswrapper[4909]: I1201 10:46:06.194603 4909 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:46:06 crc kubenswrapper[4909]: I1201 10:46:06.457439 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l"] Dec 01 10:46:06 crc kubenswrapper[4909]: E1201 10:46:06.959412 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-9txzx" podUID="cfac3bef-3c49-47ba-94be-d267a57f731a" Dec 01 10:46:06 crc kubenswrapper[4909]: I1201 10:46:06.982271 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbhwr" event={"ID":"b78befb3-d754-4c59-b0fd-d5dcb9e06588","Type":"ContainerStarted","Data":"7335326daea10187040aed2da6b79d37d4dcf12651ae91e8dcce14abe656abc9"} Dec 01 10:46:06 crc kubenswrapper[4909]: I1201 10:46:06.986508 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz" event={"ID":"246fed43-c776-4c8c-a6ac-1f4364191266","Type":"ContainerStarted","Data":"91290d175ab7eb16f5e962f7212b3212bbc1133ea5aab067f6274051badc9bc9"} Dec 01 10:46:06 crc kubenswrapper[4909]: I1201 10:46:06.994521 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-9txzx" event={"ID":"cfac3bef-3c49-47ba-94be-d267a57f731a","Type":"ContainerStarted","Data":"eb9b083491fbdc2d6fa54c5d4767bb5833acc0aaf7b4036f5e18612902ada921"} Dec 01 10:46:06 crc kubenswrapper[4909]: I1201 10:46:06.998564 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" event={"ID":"c1265544-727e-4a41-a2e4-c612230cbbc0","Type":"ContainerStarted","Data":"d075129753a855a9a10bc6c43dfcf16889835bb97e3ac40c5796c07f718b6c47"} Dec 01 10:46:07 crc kubenswrapper[4909]: I1201 10:46:07.004123 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq"] Dec 01 10:46:07 crc kubenswrapper[4909]: I1201 10:46:07.007387 4909 generic.go:334] "Generic (PLEG): container finished" podID="3b8ef6df-1ab3-4c5c-a232-730700d94d24" containerID="fc14379c46d990db6d092d36b34bfd0ab730733408cdb4c3ef155e67b5a7e092" exitCode=0 Dec 01 10:46:07 crc kubenswrapper[4909]: I1201 10:46:07.008254 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zvrd7" event={"ID":"3b8ef6df-1ab3-4c5c-a232-730700d94d24","Type":"ContainerDied","Data":"fc14379c46d990db6d092d36b34bfd0ab730733408cdb4c3ef155e67b5a7e092"} Dec 01 10:46:07 crc kubenswrapper[4909]: I1201 10:46:07.017995 4909 generic.go:334] "Generic (PLEG): container finished" podID="60e96450-bac5-45cf-94b2-01ad0d4a267e" containerID="e2b333a0161d10b2f4203255018d22e410f835197cd4af715f3565b77be969ab" exitCode=0 Dec 01 10:46:07 crc kubenswrapper[4909]: I1201 10:46:07.018057 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q62kw" event={"ID":"60e96450-bac5-45cf-94b2-01ad0d4a267e","Type":"ContainerDied","Data":"e2b333a0161d10b2f4203255018d22e410f835197cd4af715f3565b77be969ab"} Dec 01 10:46:07 crc kubenswrapper[4909]: I1201 10:46:07.024327 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nmswj" event={"ID":"02f59677-4f80-4d7d-8921-08e112fadbc2","Type":"ContainerStarted","Data":"45c82ba24a23a735fd7d4e58f6f46b64b1a64e463344edca8abafe3fc49ad6ff"} Dec 01 
10:46:07 crc kubenswrapper[4909]: I1201 10:46:07.063766 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-wgsw2" event={"ID":"63d2cbce-2ef0-43e7-83da-8350663cf0c0","Type":"ContainerStarted","Data":"e07d19359952342957b14fd47e0dca3ec015a7ff33417638309bbc2ae302a168"} Dec 01 10:46:07 crc kubenswrapper[4909]: E1201 10:46:07.064073 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-nxbp4" podUID="949868bd-22ef-467f-b767-2d6dbd02dab1" Dec 01 10:46:07 crc kubenswrapper[4909]: I1201 10:46:07.064525 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-wgsw2" Dec 01 10:46:07 crc kubenswrapper[4909]: I1201 10:46:07.071325 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-pg7qg" event={"ID":"8e79701a-e50d-4782-aca7-82ce16bbfa7e","Type":"ContainerStarted","Data":"0c68dd67690f967eb2507e8fb5594c49a321f246ce7e82ce1a2e82af03219f1b"} Dec 01 10:46:07 crc kubenswrapper[4909]: I1201 10:46:07.140057 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nmswj" podStartSLOduration=6.589966332 podStartE2EDuration="36.140033181s" podCreationTimestamp="2025-12-01 10:45:31 +0000 UTC" firstStartedPulling="2025-12-01 10:45:33.053843356 +0000 UTC m=+850.288314254" lastFinishedPulling="2025-12-01 10:46:02.603910195 +0000 UTC m=+879.838381103" observedRunningTime="2025-12-01 10:46:07.13474451 +0000 UTC m=+884.369215418" watchObservedRunningTime="2025-12-01 10:46:07.140033181 +0000 UTC m=+884.374504079" Dec 01 10:46:07 crc kubenswrapper[4909]: I1201 10:46:07.155220 
4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-wgsw2" podStartSLOduration=6.806692904 podStartE2EDuration="36.155204591s" podCreationTimestamp="2025-12-01 10:45:31 +0000 UTC" firstStartedPulling="2025-12-01 10:45:33.062136125 +0000 UTC m=+850.296607023" lastFinishedPulling="2025-12-01 10:46:02.410647812 +0000 UTC m=+879.645118710" observedRunningTime="2025-12-01 10:46:07.15425727 +0000 UTC m=+884.388728168" watchObservedRunningTime="2025-12-01 10:46:07.155204591 +0000 UTC m=+884.389675489" Dec 01 10:46:07 crc kubenswrapper[4909]: E1201 10:46:07.257655 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-q7zr4" podUID="79de64b2-dbd0-4877-9556-8fb199c9786c" Dec 01 10:46:07 crc kubenswrapper[4909]: E1201 10:46:07.549009 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z98b8" podUID="67a7fb10-11c3-4fb1-90c7-ea30b122719f" Dec 01 10:46:07 crc kubenswrapper[4909]: E1201 10:46:07.701558 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-twgwh" podUID="6d2a5c48-0cc8-4b2f-9c17-ee4d82e72e45" Dec 01 10:46:08 crc kubenswrapper[4909]: E1201 10:46:08.045276 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context 
canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-dq9jz" podUID="3dcfb91f-b0e5-4087-99cb-09cec4cd5f72" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.096024 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz" event={"ID":"246fed43-c776-4c8c-a6ac-1f4364191266","Type":"ContainerStarted","Data":"a229bcf2587a34b4adae78a74a8ba2fc4fe818927f69bef3331c5a0b5dfe0fae"} Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.097000 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.134377 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-q7zr4" event={"ID":"79de64b2-dbd0-4877-9556-8fb199c9786c","Type":"ContainerStarted","Data":"ee92d35d0959df1f109c2ac7afa897e1370138d06620b5173ebdd747ef14efc9"} Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.159661 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz" podStartSLOduration=33.657249172 podStartE2EDuration="38.15964601s" podCreationTimestamp="2025-12-01 10:45:30 +0000 UTC" firstStartedPulling="2025-12-01 10:46:01.87900583 +0000 UTC m=+879.113476728" lastFinishedPulling="2025-12-01 10:46:06.381402668 +0000 UTC m=+883.615873566" observedRunningTime="2025-12-01 10:46:08.141211745 +0000 UTC m=+885.375682643" watchObservedRunningTime="2025-12-01 10:46:08.15964601 +0000 UTC m=+885.394116908" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.167035 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-nxbp4" 
event={"ID":"949868bd-22ef-467f-b767-2d6dbd02dab1","Type":"ContainerStarted","Data":"8522da37624ef394026be4c59588deabcb983af94f4e52f605ef29a51f827b69"} Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.201635 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2rz9d" event={"ID":"510857ee-52e0-4d2c-8fff-2cc3dabb0dfa","Type":"ContainerStarted","Data":"303c7d2f9c1e40697e2b9ca0056757b3a03c5f9405ee023fb468d6f17aa2cc93"} Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.204955 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2rz9d" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.231463 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2rz9d" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.242647 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2rz9d" podStartSLOduration=4.232966852 podStartE2EDuration="38.242624779s" podCreationTimestamp="2025-12-01 10:45:30 +0000 UTC" firstStartedPulling="2025-12-01 10:45:32.583946272 +0000 UTC m=+849.818417170" lastFinishedPulling="2025-12-01 10:46:06.593604209 +0000 UTC m=+883.828075097" observedRunningTime="2025-12-01 10:46:08.242102342 +0000 UTC m=+885.476573240" watchObservedRunningTime="2025-12-01 10:46:08.242624779 +0000 UTC m=+885.477095667" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.247587 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-twgwh" event={"ID":"6d2a5c48-0cc8-4b2f-9c17-ee4d82e72e45","Type":"ContainerStarted","Data":"e00b013b65d817c9c7f9899f6c803c123304f3459fa25195dbd8936cd1c9587c"} Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.285772 
4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-77s9g" event={"ID":"19f1e6b3-cf7a-45b0-b9c2-8c48bdf6ead1","Type":"ContainerStarted","Data":"2ba90052989ee6788612278fe39e8ec46829a8abfdee4ef23bbc699f05f09388"} Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.288061 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-77s9g" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.296543 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-77s9g" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.320212 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-pg7qg" event={"ID":"8e79701a-e50d-4782-aca7-82ce16bbfa7e","Type":"ContainerStarted","Data":"455451b295bb06e560bbbedbca00e7dcbbc60b1baac63d0b7f54d7e7aca4017b"} Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.320335 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-pg7qg" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.343118 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7qnmh" event={"ID":"d527652d-7e68-4494-9e69-75296fb63932","Type":"ContainerStarted","Data":"c4c4665c44615496cfc3df03418e55597378e02187709844328dc6708b194083"} Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.343837 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7qnmh" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.373326 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-wgsw2" event={"ID":"63d2cbce-2ef0-43e7-83da-8350663cf0c0","Type":"ContainerStarted","Data":"4e9668df98255fce29b9cf37220cadcd382ce611dbcc8bf6cc0d1a462d8bd4cd"} Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.398702 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-pg7qg" podStartSLOduration=7.931696129 podStartE2EDuration="37.398675967s" podCreationTimestamp="2025-12-01 10:45:31 +0000 UTC" firstStartedPulling="2025-12-01 10:45:33.091018331 +0000 UTC m=+850.325489229" lastFinishedPulling="2025-12-01 10:46:02.557998169 +0000 UTC m=+879.792469067" observedRunningTime="2025-12-01 10:46:08.382320989 +0000 UTC m=+885.616791887" watchObservedRunningTime="2025-12-01 10:46:08.398675967 +0000 UTC m=+885.633146865" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.399856 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z98b8" event={"ID":"67a7fb10-11c3-4fb1-90c7-ea30b122719f","Type":"ContainerStarted","Data":"656aa914be41112722347f3c495d029fff3069eb3aa0523ee7659133903a8733"} Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.409535 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-dq9jz" event={"ID":"3dcfb91f-b0e5-4087-99cb-09cec4cd5f72","Type":"ContainerStarted","Data":"2450d52719c98ae2e400ed3d520a57b180167010001dafe721dffd58c0ce7b3c"} Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.455098 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" event={"ID":"38909617-5f76-49e1-a3ad-0c3917fddb55","Type":"ContainerStarted","Data":"1bb7f6d2a806e55ad037bba4dbc356fab9939c244aedf44db099a5b59e4d23ee"} Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.456330 4909 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.498688 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wjqrn" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.510662 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wjqrn" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.510791 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pzh2j" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.510854 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pzh2j" event={"ID":"577bc508-a3f5-4b97-9360-12b1dfee3890","Type":"ContainerStarted","Data":"56f04601b298a9fe3de8acba1dcc449264a419190009b92799c778e0578e49e7"} Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.504770 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-77s9g" podStartSLOduration=4.072825459 podStartE2EDuration="38.504739333s" podCreationTimestamp="2025-12-01 10:45:30 +0000 UTC" firstStartedPulling="2025-12-01 10:45:32.09741763 +0000 UTC m=+849.331888528" lastFinishedPulling="2025-12-01 10:46:06.529331504 +0000 UTC m=+883.763802402" observedRunningTime="2025-12-01 10:46:08.416991779 +0000 UTC m=+885.651462687" watchObservedRunningTime="2025-12-01 10:46:08.504739333 +0000 UTC m=+885.739210231" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.584936 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-v585g" 
event={"ID":"51f475b0-ff8d-4438-8c91-c41ce79ea8d1","Type":"ContainerStarted","Data":"05594775d3145c51605e26dd08df3eda2bec4f049b7d2a9f623e8602a2cbd8e0"} Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.586423 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7qnmh" podStartSLOduration=4.060715313 podStartE2EDuration="37.586400878s" podCreationTimestamp="2025-12-01 10:45:31 +0000 UTC" firstStartedPulling="2025-12-01 10:45:33.002276605 +0000 UTC m=+850.236747503" lastFinishedPulling="2025-12-01 10:46:06.52796217 +0000 UTC m=+883.762433068" observedRunningTime="2025-12-01 10:46:08.504235626 +0000 UTC m=+885.738706534" watchObservedRunningTime="2025-12-01 10:46:08.586400878 +0000 UTC m=+885.820871776" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.603352 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-v585g" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.603634 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-v585g" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.606436 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wjqrn" podStartSLOduration=4.160657551 podStartE2EDuration="38.606414475s" podCreationTimestamp="2025-12-01 10:45:30 +0000 UTC" firstStartedPulling="2025-12-01 10:45:32.149263161 +0000 UTC m=+849.383734059" lastFinishedPulling="2025-12-01 10:46:06.595020085 +0000 UTC m=+883.829490983" observedRunningTime="2025-12-01 10:46:08.532902171 +0000 UTC m=+885.767373089" watchObservedRunningTime="2025-12-01 10:46:08.606414475 +0000 UTC m=+885.840885373" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.753897 4909 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" podStartSLOduration=37.753856616 podStartE2EDuration="37.753856616s" podCreationTimestamp="2025-12-01 10:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:46:08.730326346 +0000 UTC m=+885.964797244" watchObservedRunningTime="2025-12-01 10:46:08.753856616 +0000 UTC m=+885.988327514" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.798139 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pzh2j" podStartSLOduration=5.11294902 podStartE2EDuration="38.798120685s" podCreationTimestamp="2025-12-01 10:45:30 +0000 UTC" firstStartedPulling="2025-12-01 10:45:33.07650085 +0000 UTC m=+850.310971748" lastFinishedPulling="2025-12-01 10:46:06.761672525 +0000 UTC m=+883.996143413" observedRunningTime="2025-12-01 10:46:08.792063769 +0000 UTC m=+886.026534667" watchObservedRunningTime="2025-12-01 10:46:08.798120685 +0000 UTC m=+886.032591573" Dec 01 10:46:08 crc kubenswrapper[4909]: I1201 10:46:08.799078 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-v585g" podStartSLOduration=5.245863859 podStartE2EDuration="38.799072335s" podCreationTimestamp="2025-12-01 10:45:30 +0000 UTC" firstStartedPulling="2025-12-01 10:45:32.979936131 +0000 UTC m=+850.214407029" lastFinishedPulling="2025-12-01 10:46:06.533144607 +0000 UTC m=+883.767615505" observedRunningTime="2025-12-01 10:46:08.753116982 +0000 UTC m=+885.987587880" watchObservedRunningTime="2025-12-01 10:46:08.799072335 +0000 UTC m=+886.033543233" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.453842 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dx7kf"] Dec 01 10:46:09 crc 
kubenswrapper[4909]: I1201 10:46:09.455420 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dx7kf" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.475228 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dx7kf"] Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.543107 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f-catalog-content\") pod \"redhat-operators-dx7kf\" (UID: \"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f\") " pod="openshift-marketplace/redhat-operators-dx7kf" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.543241 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f-utilities\") pod \"redhat-operators-dx7kf\" (UID: \"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f\") " pod="openshift-marketplace/redhat-operators-dx7kf" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.543285 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g76fc\" (UniqueName: \"kubernetes.io/projected/6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f-kube-api-access-g76fc\") pod \"redhat-operators-dx7kf\" (UID: \"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f\") " pod="openshift-marketplace/redhat-operators-dx7kf" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.619782 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zbkjz" event={"ID":"282ee9d9-b6ee-4dd7-bce4-419428bd744b","Type":"ContainerStarted","Data":"da4a377817ea491b0eb3b645a81faa17df9bd7417ebbc166077bbde8e6b682ae"} Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.620622 4909 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zbkjz" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.624514 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zbkjz" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.631051 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-q7zr4" event={"ID":"79de64b2-dbd0-4877-9556-8fb199c9786c","Type":"ContainerStarted","Data":"80ddacf6acf3fdcb03227426e3a8c708d7168a0e63fb31d645f29f587c8fccbe"} Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.631236 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-q7zr4" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.637971 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-twgwh" event={"ID":"6d2a5c48-0cc8-4b2f-9c17-ee4d82e72e45","Type":"ContainerStarted","Data":"4a8a61357cd6df4f6d5f33244feb16549b770427cf730c505745a7690930c201"} Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.638861 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-twgwh" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.642374 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-nxbp4" event={"ID":"949868bd-22ef-467f-b767-2d6dbd02dab1","Type":"ContainerStarted","Data":"f8a3a8eba0e42b9265759a2226f77746b79eaac26965aaee3990e784cbcffa87"} Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.643762 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-nxbp4" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.645149 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f-utilities\") pod \"redhat-operators-dx7kf\" (UID: \"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f\") " pod="openshift-marketplace/redhat-operators-dx7kf" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.645241 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g76fc\" (UniqueName: \"kubernetes.io/projected/6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f-kube-api-access-g76fc\") pod \"redhat-operators-dx7kf\" (UID: \"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f\") " pod="openshift-marketplace/redhat-operators-dx7kf" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.645302 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f-catalog-content\") pod \"redhat-operators-dx7kf\" (UID: \"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f\") " pod="openshift-marketplace/redhat-operators-dx7kf" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.646786 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f-utilities\") pod \"redhat-operators-dx7kf\" (UID: \"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f\") " pod="openshift-marketplace/redhat-operators-dx7kf" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.646860 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f-catalog-content\") pod \"redhat-operators-dx7kf\" (UID: \"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f\") " 
pod="openshift-marketplace/redhat-operators-dx7kf" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.663691 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-9txzx" event={"ID":"cfac3bef-3c49-47ba-94be-d267a57f731a","Type":"ContainerStarted","Data":"c2fd1c27c47ec13aefdfb67bfe894a667d112991a0a9dbfe3528e72fc1b4f8a9"} Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.663855 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-9txzx" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.667045 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zbkjz" podStartSLOduration=5.753474458 podStartE2EDuration="39.667021157s" podCreationTimestamp="2025-12-01 10:45:30 +0000 UTC" firstStartedPulling="2025-12-01 10:45:32.669815509 +0000 UTC m=+849.904286407" lastFinishedPulling="2025-12-01 10:46:06.583362208 +0000 UTC m=+883.817833106" observedRunningTime="2025-12-01 10:46:09.659686521 +0000 UTC m=+886.894157439" watchObservedRunningTime="2025-12-01 10:46:09.667021157 +0000 UTC m=+886.901492055" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.677268 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-fh7wp" event={"ID":"8a1199cd-df49-4078-b2ab-211f902bd097","Type":"ContainerStarted","Data":"3dabb7e1a94db2966e003f74121990acfa398880ff30b654105a773b577d78bf"} Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.678594 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-fh7wp" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.680342 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g76fc\" (UniqueName: 
\"kubernetes.io/projected/6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f-kube-api-access-g76fc\") pod \"redhat-operators-dx7kf\" (UID: \"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f\") " pod="openshift-marketplace/redhat-operators-dx7kf" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.682241 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-twgwh" podStartSLOduration=3.46619002 podStartE2EDuration="39.682228689s" podCreationTimestamp="2025-12-01 10:45:30 +0000 UTC" firstStartedPulling="2025-12-01 10:45:32.776775821 +0000 UTC m=+850.011246719" lastFinishedPulling="2025-12-01 10:46:08.99281449 +0000 UTC m=+886.227285388" observedRunningTime="2025-12-01 10:46:09.680226945 +0000 UTC m=+886.914697843" watchObservedRunningTime="2025-12-01 10:46:09.682228689 +0000 UTC m=+886.916699607" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.689613 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-fh7wp" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.696200 4909 generic.go:334] "Generic (PLEG): container finished" podID="60e96450-bac5-45cf-94b2-01ad0d4a267e" containerID="3b468f5bff9f5074def9ab914640ed72a87530c18a3e42672021e4b845c612ac" exitCode=0 Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.696305 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q62kw" event={"ID":"60e96450-bac5-45cf-94b2-01ad0d4a267e","Type":"ContainerDied","Data":"3b468f5bff9f5074def9ab914640ed72a87530c18a3e42672021e4b845c612ac"} Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.700177 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" 
event={"ID":"38909617-5f76-49e1-a3ad-0c3917fddb55","Type":"ContainerStarted","Data":"7affc5726f692e59df92bedb4a910ba8eb1bd9d4b706c3702a32cfb8261716ec"} Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.706731 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbhwr" event={"ID":"b78befb3-d754-4c59-b0fd-d5dcb9e06588","Type":"ContainerStarted","Data":"1c1567713361035735fc2a7e844df2203dd8413abbc4460b14e145fbb4955cd5"} Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.706903 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbhwr" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.712838 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hdfcg" event={"ID":"dc6b6fc4-e26d-4401-ba4f-f87c3a5b503c","Type":"ContainerStarted","Data":"0380502750c92d07e0877b78296583d2ff049eb06a0c433604d7ffd84f8ceab2"} Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.713821 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hdfcg" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.717806 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-nxbp4" podStartSLOduration=3.318452192 podStartE2EDuration="39.717780976s" podCreationTimestamp="2025-12-01 10:45:30 +0000 UTC" firstStartedPulling="2025-12-01 10:45:32.58711407 +0000 UTC m=+849.821584968" lastFinishedPulling="2025-12-01 10:46:08.986442854 +0000 UTC m=+886.220913752" observedRunningTime="2025-12-01 10:46:09.712483425 +0000 UTC m=+886.946954323" watchObservedRunningTime="2025-12-01 10:46:09.717780976 +0000 UTC m=+886.952251874" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.718832 4909 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hdfcg" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.744503 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-wjqrn" event={"ID":"1cedb704-1d3a-4c68-b14e-00bc65309891","Type":"ContainerStarted","Data":"eafca2c8ac111ae999f298155ff8db7b387d88278e823189df18754c7e683bb1"} Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.748587 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-q7zr4" podStartSLOduration=3.146242806 podStartE2EDuration="39.74856777s" podCreationTimestamp="2025-12-01 10:45:30 +0000 UTC" firstStartedPulling="2025-12-01 10:45:32.648701703 +0000 UTC m=+849.883172601" lastFinishedPulling="2025-12-01 10:46:09.251026667 +0000 UTC m=+886.485497565" observedRunningTime="2025-12-01 10:46:09.744853231 +0000 UTC m=+886.979324159" watchObservedRunningTime="2025-12-01 10:46:09.74856777 +0000 UTC m=+886.983038668" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.754765 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z98b8" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.757079 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-dq9jz" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.774416 4909 generic.go:334] "Generic (PLEG): container finished" podID="3b8ef6df-1ab3-4c5c-a232-730700d94d24" containerID="9b5d9c1a9688719daa3416297f2ab02d11b230a73f738d9af5d6f2f1595e6e83" exitCode=0 Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.774909 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-zvrd7" event={"ID":"3b8ef6df-1ab3-4c5c-a232-730700d94d24","Type":"ContainerDied","Data":"9b5d9c1a9688719daa3416297f2ab02d11b230a73f738d9af5d6f2f1595e6e83"} Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.785824 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-7qnmh" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.823672 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dx7kf" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.872643 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hdfcg" podStartSLOduration=5.447771976 podStartE2EDuration="39.870706755s" podCreationTimestamp="2025-12-01 10:45:30 +0000 UTC" firstStartedPulling="2025-12-01 10:45:32.110144926 +0000 UTC m=+849.344615824" lastFinishedPulling="2025-12-01 10:46:06.533079705 +0000 UTC m=+883.767550603" observedRunningTime="2025-12-01 10:46:09.846562164 +0000 UTC m=+887.081033092" watchObservedRunningTime="2025-12-01 10:46:09.870706755 +0000 UTC m=+887.105177673" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.886553 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-9txzx" podStartSLOduration=4.780546825 podStartE2EDuration="39.886535935s" podCreationTimestamp="2025-12-01 10:45:30 +0000 UTC" firstStartedPulling="2025-12-01 10:45:32.451033374 +0000 UTC m=+849.685504272" lastFinishedPulling="2025-12-01 10:46:07.557022484 +0000 UTC m=+884.791493382" observedRunningTime="2025-12-01 10:46:09.882201425 +0000 UTC m=+887.116672343" watchObservedRunningTime="2025-12-01 10:46:09.886535935 +0000 UTC m=+887.121006833" Dec 01 10:46:09 crc kubenswrapper[4909]: I1201 10:46:09.908012 4909 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-dq9jz" podStartSLOduration=2.538261592 podStartE2EDuration="38.907991548s" podCreationTimestamp="2025-12-01 10:45:31 +0000 UTC" firstStartedPulling="2025-12-01 10:45:32.78834014 +0000 UTC m=+850.022811038" lastFinishedPulling="2025-12-01 10:46:09.158070096 +0000 UTC m=+886.392540994" observedRunningTime="2025-12-01 10:46:09.906766839 +0000 UTC m=+887.141237737" watchObservedRunningTime="2025-12-01 10:46:09.907991548 +0000 UTC m=+887.142462446" Dec 01 10:46:10 crc kubenswrapper[4909]: I1201 10:46:10.007330 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-fh7wp" podStartSLOduration=6.231716363 podStartE2EDuration="40.007297404s" podCreationTimestamp="2025-12-01 10:45:30 +0000 UTC" firstStartedPulling="2025-12-01 10:45:32.797217246 +0000 UTC m=+850.031688134" lastFinishedPulling="2025-12-01 10:46:06.572798277 +0000 UTC m=+883.807269175" observedRunningTime="2025-12-01 10:46:09.953915551 +0000 UTC m=+887.188386449" watchObservedRunningTime="2025-12-01 10:46:10.007297404 +0000 UTC m=+887.241768302" Dec 01 10:46:10 crc kubenswrapper[4909]: I1201 10:46:10.014739 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z98b8" podStartSLOduration=3.545234251 podStartE2EDuration="40.014710463s" podCreationTimestamp="2025-12-01 10:45:30 +0000 UTC" firstStartedPulling="2025-12-01 10:45:32.565996415 +0000 UTC m=+849.800467313" lastFinishedPulling="2025-12-01 10:46:09.035472627 +0000 UTC m=+886.269943525" observedRunningTime="2025-12-01 10:46:10.002371616 +0000 UTC m=+887.236842534" watchObservedRunningTime="2025-12-01 10:46:10.014710463 +0000 UTC m=+887.249181361" Dec 01 10:46:10 crc kubenswrapper[4909]: I1201 10:46:10.088339 4909 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbhwr" podStartSLOduration=9.739254855 podStartE2EDuration="39.08831493s" podCreationTimestamp="2025-12-01 10:45:31 +0000 UTC" firstStartedPulling="2025-12-01 10:45:33.061601468 +0000 UTC m=+850.296072366" lastFinishedPulling="2025-12-01 10:46:02.410661553 +0000 UTC m=+879.645132441" observedRunningTime="2025-12-01 10:46:10.052310948 +0000 UTC m=+887.286781836" watchObservedRunningTime="2025-12-01 10:46:10.08831493 +0000 UTC m=+887.322785828" Dec 01 10:46:10 crc kubenswrapper[4909]: I1201 10:46:10.477526 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dx7kf"] Dec 01 10:46:10 crc kubenswrapper[4909]: I1201 10:46:10.787043 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z98b8" event={"ID":"67a7fb10-11c3-4fb1-90c7-ea30b122719f","Type":"ContainerStarted","Data":"6e97dd61b19b69efaee16e7d5a144d032407b492ce99bc1348b42259e356448b"} Dec 01 10:46:10 crc kubenswrapper[4909]: I1201 10:46:10.790410 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-dq9jz" event={"ID":"3dcfb91f-b0e5-4087-99cb-09cec4cd5f72","Type":"ContainerStarted","Data":"b5a63a4f85c8a625d1a7cffca3aa0bad0978b6d7e9fe08a1a019e752aee524bd"} Dec 01 10:46:10 crc kubenswrapper[4909]: I1201 10:46:10.794199 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zvrd7" event={"ID":"3b8ef6df-1ab3-4c5c-a232-730700d94d24","Type":"ContainerStarted","Data":"b76d6a3642507c136c285c17b4245e2c91f08924cae6f2c2d791b0617d45d16e"} Dec 01 10:46:10 crc kubenswrapper[4909]: I1201 10:46:10.798574 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q62kw" 
event={"ID":"60e96450-bac5-45cf-94b2-01ad0d4a267e","Type":"ContainerStarted","Data":"2a322948ffd7387eafd85ba2cf5ec0e7f8ce2606da11745de36b6f8dbbd38c8b"} Dec 01 10:46:10 crc kubenswrapper[4909]: I1201 10:46:10.805722 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx7kf" event={"ID":"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f","Type":"ContainerStarted","Data":"0d8ffbe07de06ea63ec37c9dbcc96e15ed3b2a5278c65a1784b82be1c39cc43f"} Dec 01 10:46:10 crc kubenswrapper[4909]: I1201 10:46:10.805770 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx7kf" event={"ID":"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f","Type":"ContainerStarted","Data":"b22f7d3135aae037baa86b40a5968ae98aceb8829e1f462eb4546e5b91382e52"} Dec 01 10:46:10 crc kubenswrapper[4909]: I1201 10:46:10.813570 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zvrd7" podStartSLOduration=10.576355928 podStartE2EDuration="13.813533495s" podCreationTimestamp="2025-12-01 10:45:57 +0000 UTC" firstStartedPulling="2025-12-01 10:46:07.13382672 +0000 UTC m=+884.368297618" lastFinishedPulling="2025-12-01 10:46:10.371004287 +0000 UTC m=+887.605475185" observedRunningTime="2025-12-01 10:46:10.811663803 +0000 UTC m=+888.046134711" watchObservedRunningTime="2025-12-01 10:46:10.813533495 +0000 UTC m=+888.048004393" Dec 01 10:46:10 crc kubenswrapper[4909]: I1201 10:46:10.841432 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q62kw" podStartSLOduration=8.517536658000001 podStartE2EDuration="11.841399174s" podCreationTimestamp="2025-12-01 10:45:59 +0000 UTC" firstStartedPulling="2025-12-01 10:46:07.150182468 +0000 UTC m=+884.384653366" lastFinishedPulling="2025-12-01 10:46:10.474044994 +0000 UTC m=+887.708515882" observedRunningTime="2025-12-01 10:46:10.839552505 +0000 UTC m=+888.074023403" 
watchObservedRunningTime="2025-12-01 10:46:10.841399174 +0000 UTC m=+888.075870072" Dec 01 10:46:11 crc kubenswrapper[4909]: I1201 10:46:11.280982 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8c7f494db-q9n6n" Dec 01 10:46:11 crc kubenswrapper[4909]: I1201 10:46:11.806631 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pzh2j" Dec 01 10:46:11 crc kubenswrapper[4909]: I1201 10:46:11.818094 4909 generic.go:334] "Generic (PLEG): container finished" podID="6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f" containerID="0d8ffbe07de06ea63ec37c9dbcc96e15ed3b2a5278c65a1784b82be1c39cc43f" exitCode=0 Dec 01 10:46:11 crc kubenswrapper[4909]: I1201 10:46:11.819184 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx7kf" event={"ID":"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f","Type":"ContainerDied","Data":"0d8ffbe07de06ea63ec37c9dbcc96e15ed3b2a5278c65a1784b82be1c39cc43f"} Dec 01 10:46:12 crc kubenswrapper[4909]: I1201 10:46:12.041080 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-wgsw2" Dec 01 10:46:12 crc kubenswrapper[4909]: I1201 10:46:12.152042 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-pg7qg" Dec 01 10:46:12 crc kubenswrapper[4909]: I1201 10:46:12.366653 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xbhwr" Dec 01 10:46:13 crc kubenswrapper[4909]: I1201 10:46:13.841595 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx7kf" 
event={"ID":"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f","Type":"ContainerStarted","Data":"cc65083fd523498ade84f141e06257a432eab9770e5b2a7813795ab8f6ca6616"} Dec 01 10:46:13 crc kubenswrapper[4909]: I1201 10:46:13.843312 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8c7f494db-q9n6n" event={"ID":"9cbe8fe9-4f7c-4ba7-b6c3-ff8a074fcc41","Type":"ContainerStarted","Data":"8868a65ba4241cdbebfa1206f5f8b27194eb227aa1389346f05bbb58fdf3da6a"} Dec 01 10:46:13 crc kubenswrapper[4909]: I1201 10:46:13.844933 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" event={"ID":"c1265544-727e-4a41-a2e4-c612230cbbc0","Type":"ContainerStarted","Data":"d4d7269c410e392ba4ed9a59c8c042afde9d29ad8b2251f1fd07775daea9657f"} Dec 01 10:46:13 crc kubenswrapper[4909]: I1201 10:46:13.844956 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" event={"ID":"c1265544-727e-4a41-a2e4-c612230cbbc0","Type":"ContainerStarted","Data":"8c903f52075f51ce9ed8b554c21997dc82f77d5d9eea0b3cccab2c08606a30e3"} Dec 01 10:46:13 crc kubenswrapper[4909]: I1201 10:46:13.845098 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" Dec 01 10:46:13 crc kubenswrapper[4909]: I1201 10:46:13.941106 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8c7f494db-q9n6n" podStartSLOduration=16.260964411 podStartE2EDuration="43.941072571s" podCreationTimestamp="2025-12-01 10:45:30 +0000 UTC" firstStartedPulling="2025-12-01 10:45:32.615290285 +0000 UTC m=+849.849761183" lastFinishedPulling="2025-12-01 10:46:00.295398445 +0000 UTC m=+877.529869343" observedRunningTime="2025-12-01 10:46:13.88375809 +0000 UTC 
m=+891.118228988" watchObservedRunningTime="2025-12-01 10:46:13.941072571 +0000 UTC m=+891.175543469" Dec 01 10:46:13 crc kubenswrapper[4909]: I1201 10:46:13.941810 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" podStartSLOduration=37.503805375 podStartE2EDuration="43.941803584s" podCreationTimestamp="2025-12-01 10:45:30 +0000 UTC" firstStartedPulling="2025-12-01 10:46:06.529100516 +0000 UTC m=+883.763571414" lastFinishedPulling="2025-12-01 10:46:12.967098725 +0000 UTC m=+890.201569623" observedRunningTime="2025-12-01 10:46:13.934993195 +0000 UTC m=+891.169464093" watchObservedRunningTime="2025-12-01 10:46:13.941803584 +0000 UTC m=+891.176274482" Dec 01 10:46:14 crc kubenswrapper[4909]: I1201 10:46:14.169367 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7884db5fb-nljsq" Dec 01 10:46:15 crc kubenswrapper[4909]: I1201 10:46:15.862955 4909 generic.go:334] "Generic (PLEG): container finished" podID="6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f" containerID="cc65083fd523498ade84f141e06257a432eab9770e5b2a7813795ab8f6ca6616" exitCode=0 Dec 01 10:46:15 crc kubenswrapper[4909]: I1201 10:46:15.863049 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx7kf" event={"ID":"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f","Type":"ContainerDied","Data":"cc65083fd523498ade84f141e06257a432eab9770e5b2a7813795ab8f6ca6616"} Dec 01 10:46:17 crc kubenswrapper[4909]: I1201 10:46:17.122119 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-wbqsz" Dec 01 10:46:17 crc kubenswrapper[4909]: I1201 10:46:17.425911 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zvrd7" Dec 01 10:46:17 crc 
kubenswrapper[4909]: I1201 10:46:17.425956 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zvrd7" Dec 01 10:46:17 crc kubenswrapper[4909]: I1201 10:46:17.472907 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zvrd7" Dec 01 10:46:17 crc kubenswrapper[4909]: I1201 10:46:17.922687 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zvrd7" Dec 01 10:46:18 crc kubenswrapper[4909]: I1201 10:46:18.852731 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zvrd7"] Dec 01 10:46:19 crc kubenswrapper[4909]: I1201 10:46:19.894797 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zvrd7" podUID="3b8ef6df-1ab3-4c5c-a232-730700d94d24" containerName="registry-server" containerID="cri-o://b76d6a3642507c136c285c17b4245e2c91f08924cae6f2c2d791b0617d45d16e" gracePeriod=2 Dec 01 10:46:20 crc kubenswrapper[4909]: I1201 10:46:20.388823 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q62kw" Dec 01 10:46:20 crc kubenswrapper[4909]: I1201 10:46:20.389323 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q62kw" Dec 01 10:46:20 crc kubenswrapper[4909]: I1201 10:46:20.431711 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q62kw" Dec 01 10:46:20 crc kubenswrapper[4909]: I1201 10:46:20.903305 4909 generic.go:334] "Generic (PLEG): container finished" podID="3b8ef6df-1ab3-4c5c-a232-730700d94d24" containerID="b76d6a3642507c136c285c17b4245e2c91f08924cae6f2c2d791b0617d45d16e" exitCode=0 Dec 01 10:46:20 crc kubenswrapper[4909]: I1201 10:46:20.904133 4909 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zvrd7" event={"ID":"3b8ef6df-1ab3-4c5c-a232-730700d94d24","Type":"ContainerDied","Data":"b76d6a3642507c136c285c17b4245e2c91f08924cae6f2c2d791b0617d45d16e"} Dec 01 10:46:20 crc kubenswrapper[4909]: I1201 10:46:20.944856 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q62kw" Dec 01 10:46:20 crc kubenswrapper[4909]: I1201 10:46:20.950521 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-9txzx" Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.065679 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z98b8" Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.239523 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-nxbp4" Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.251508 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q62kw"] Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.416512 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-q7zr4" Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.456988 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zvrd7" Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.462084 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-twgwh" Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.593785 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b8ef6df-1ab3-4c5c-a232-730700d94d24-catalog-content\") pod \"3b8ef6df-1ab3-4c5c-a232-730700d94d24\" (UID: \"3b8ef6df-1ab3-4c5c-a232-730700d94d24\") " Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.593856 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b8ef6df-1ab3-4c5c-a232-730700d94d24-utilities\") pod \"3b8ef6df-1ab3-4c5c-a232-730700d94d24\" (UID: \"3b8ef6df-1ab3-4c5c-a232-730700d94d24\") " Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.593936 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpqr8\" (UniqueName: \"kubernetes.io/projected/3b8ef6df-1ab3-4c5c-a232-730700d94d24-kube-api-access-xpqr8\") pod \"3b8ef6df-1ab3-4c5c-a232-730700d94d24\" (UID: \"3b8ef6df-1ab3-4c5c-a232-730700d94d24\") " Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.595742 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b8ef6df-1ab3-4c5c-a232-730700d94d24-utilities" (OuterVolumeSpecName: "utilities") pod "3b8ef6df-1ab3-4c5c-a232-730700d94d24" (UID: "3b8ef6df-1ab3-4c5c-a232-730700d94d24"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.599648 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b8ef6df-1ab3-4c5c-a232-730700d94d24-kube-api-access-xpqr8" (OuterVolumeSpecName: "kube-api-access-xpqr8") pod "3b8ef6df-1ab3-4c5c-a232-730700d94d24" (UID: "3b8ef6df-1ab3-4c5c-a232-730700d94d24"). InnerVolumeSpecName "kube-api-access-xpqr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.695294 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b8ef6df-1ab3-4c5c-a232-730700d94d24-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.695346 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpqr8\" (UniqueName: \"kubernetes.io/projected/3b8ef6df-1ab3-4c5c-a232-730700d94d24-kube-api-access-xpqr8\") on node \"crc\" DevicePath \"\"" Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.802022 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b8ef6df-1ab3-4c5c-a232-730700d94d24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b8ef6df-1ab3-4c5c-a232-730700d94d24" (UID: "3b8ef6df-1ab3-4c5c-a232-730700d94d24"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.804116 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-dq9jz" Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.898463 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b8ef6df-1ab3-4c5c-a232-730700d94d24-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.916230 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx7kf" event={"ID":"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f","Type":"ContainerStarted","Data":"f827416a0e0a026dbc9ee8d047757d08d3754a428d68662b0ee20b530c02b8da"} Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.919646 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zvrd7" Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.920031 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zvrd7" event={"ID":"3b8ef6df-1ab3-4c5c-a232-730700d94d24","Type":"ContainerDied","Data":"6652a3c35806ad841089498acd7beadeda33dadb32bd7e4422cd072b7c023e63"} Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.920087 4909 scope.go:117] "RemoveContainer" containerID="b76d6a3642507c136c285c17b4245e2c91f08924cae6f2c2d791b0617d45d16e" Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.948640 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dx7kf" podStartSLOduration=3.634059572 podStartE2EDuration="12.948604474s" podCreationTimestamp="2025-12-01 10:46:09 +0000 UTC" firstStartedPulling="2025-12-01 10:46:10.807538101 +0000 UTC m=+888.042008999" lastFinishedPulling="2025-12-01 10:46:20.122083003 
+0000 UTC m=+897.356553901" observedRunningTime="2025-12-01 10:46:21.942552479 +0000 UTC m=+899.177023477" watchObservedRunningTime="2025-12-01 10:46:21.948604474 +0000 UTC m=+899.183075372" Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.961036 4909 scope.go:117] "RemoveContainer" containerID="9b5d9c1a9688719daa3416297f2ab02d11b230a73f738d9af5d6f2f1595e6e83" Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.967465 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zvrd7"] Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.983061 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zvrd7"] Dec 01 10:46:21 crc kubenswrapper[4909]: I1201 10:46:21.997495 4909 scope.go:117] "RemoveContainer" containerID="fc14379c46d990db6d092d36b34bfd0ab730733408cdb4c3ef155e67b5a7e092" Dec 01 10:46:22 crc kubenswrapper[4909]: I1201 10:46:22.929132 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q62kw" podUID="60e96450-bac5-45cf-94b2-01ad0d4a267e" containerName="registry-server" containerID="cri-o://2a322948ffd7387eafd85ba2cf5ec0e7f8ce2606da11745de36b6f8dbbd38c8b" gracePeriod=2 Dec 01 10:46:23 crc kubenswrapper[4909]: I1201 10:46:23.270859 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b8ef6df-1ab3-4c5c-a232-730700d94d24" path="/var/lib/kubelet/pods/3b8ef6df-1ab3-4c5c-a232-730700d94d24/volumes" Dec 01 10:46:23 crc kubenswrapper[4909]: I1201 10:46:23.534487 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l" Dec 01 10:46:24 crc kubenswrapper[4909]: I1201 10:46:24.943751 4909 generic.go:334] "Generic (PLEG): container finished" podID="60e96450-bac5-45cf-94b2-01ad0d4a267e" containerID="2a322948ffd7387eafd85ba2cf5ec0e7f8ce2606da11745de36b6f8dbbd38c8b" exitCode=0 
Dec 01 10:46:24 crc kubenswrapper[4909]: I1201 10:46:24.943802 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q62kw" event={"ID":"60e96450-bac5-45cf-94b2-01ad0d4a267e","Type":"ContainerDied","Data":"2a322948ffd7387eafd85ba2cf5ec0e7f8ce2606da11745de36b6f8dbbd38c8b"} Dec 01 10:46:25 crc kubenswrapper[4909]: I1201 10:46:25.547712 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q62kw" Dec 01 10:46:25 crc kubenswrapper[4909]: I1201 10:46:25.660574 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60e96450-bac5-45cf-94b2-01ad0d4a267e-utilities\") pod \"60e96450-bac5-45cf-94b2-01ad0d4a267e\" (UID: \"60e96450-bac5-45cf-94b2-01ad0d4a267e\") " Dec 01 10:46:25 crc kubenswrapper[4909]: I1201 10:46:25.660654 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2vfl\" (UniqueName: \"kubernetes.io/projected/60e96450-bac5-45cf-94b2-01ad0d4a267e-kube-api-access-p2vfl\") pod \"60e96450-bac5-45cf-94b2-01ad0d4a267e\" (UID: \"60e96450-bac5-45cf-94b2-01ad0d4a267e\") " Dec 01 10:46:25 crc kubenswrapper[4909]: I1201 10:46:25.660697 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60e96450-bac5-45cf-94b2-01ad0d4a267e-catalog-content\") pod \"60e96450-bac5-45cf-94b2-01ad0d4a267e\" (UID: \"60e96450-bac5-45cf-94b2-01ad0d4a267e\") " Dec 01 10:46:25 crc kubenswrapper[4909]: I1201 10:46:25.661579 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60e96450-bac5-45cf-94b2-01ad0d4a267e-utilities" (OuterVolumeSpecName: "utilities") pod "60e96450-bac5-45cf-94b2-01ad0d4a267e" (UID: "60e96450-bac5-45cf-94b2-01ad0d4a267e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:46:25 crc kubenswrapper[4909]: I1201 10:46:25.668907 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60e96450-bac5-45cf-94b2-01ad0d4a267e-kube-api-access-p2vfl" (OuterVolumeSpecName: "kube-api-access-p2vfl") pod "60e96450-bac5-45cf-94b2-01ad0d4a267e" (UID: "60e96450-bac5-45cf-94b2-01ad0d4a267e"). InnerVolumeSpecName "kube-api-access-p2vfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:46:25 crc kubenswrapper[4909]: I1201 10:46:25.711856 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60e96450-bac5-45cf-94b2-01ad0d4a267e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60e96450-bac5-45cf-94b2-01ad0d4a267e" (UID: "60e96450-bac5-45cf-94b2-01ad0d4a267e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:46:25 crc kubenswrapper[4909]: I1201 10:46:25.762609 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60e96450-bac5-45cf-94b2-01ad0d4a267e-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:46:25 crc kubenswrapper[4909]: I1201 10:46:25.762673 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2vfl\" (UniqueName: \"kubernetes.io/projected/60e96450-bac5-45cf-94b2-01ad0d4a267e-kube-api-access-p2vfl\") on node \"crc\" DevicePath \"\"" Dec 01 10:46:25 crc kubenswrapper[4909]: I1201 10:46:25.762683 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60e96450-bac5-45cf-94b2-01ad0d4a267e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:46:25 crc kubenswrapper[4909]: I1201 10:46:25.952680 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q62kw" 
event={"ID":"60e96450-bac5-45cf-94b2-01ad0d4a267e","Type":"ContainerDied","Data":"88a3898068de95cf9985cb17e5dff4700b37308afcdc67bb871e3bc0b991205a"}
Dec 01 10:46:25 crc kubenswrapper[4909]: I1201 10:46:25.952720 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q62kw"
Dec 01 10:46:25 crc kubenswrapper[4909]: I1201 10:46:25.952745 4909 scope.go:117] "RemoveContainer" containerID="2a322948ffd7387eafd85ba2cf5ec0e7f8ce2606da11745de36b6f8dbbd38c8b"
Dec 01 10:46:25 crc kubenswrapper[4909]: I1201 10:46:25.973582 4909 scope.go:117] "RemoveContainer" containerID="3b468f5bff9f5074def9ab914640ed72a87530c18a3e42672021e4b845c612ac"
Dec 01 10:46:25 crc kubenswrapper[4909]: I1201 10:46:25.983460 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q62kw"]
Dec 01 10:46:25 crc kubenswrapper[4909]: I1201 10:46:25.989267 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q62kw"]
Dec 01 10:46:26 crc kubenswrapper[4909]: I1201 10:46:26.008856 4909 scope.go:117] "RemoveContainer" containerID="e2b333a0161d10b2f4203255018d22e410f835197cd4af715f3565b77be969ab"
Dec 01 10:46:27 crc kubenswrapper[4909]: I1201 10:46:27.268341 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60e96450-bac5-45cf-94b2-01ad0d4a267e" path="/var/lib/kubelet/pods/60e96450-bac5-45cf-94b2-01ad0d4a267e/volumes"
Dec 01 10:46:29 crc kubenswrapper[4909]: I1201 10:46:29.824683 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dx7kf"
Dec 01 10:46:29 crc kubenswrapper[4909]: I1201 10:46:29.825265 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dx7kf"
Dec 01 10:46:29 crc kubenswrapper[4909]: I1201 10:46:29.870963 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dx7kf"
Dec 01 10:46:30 crc kubenswrapper[4909]: I1201 10:46:30.027155 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dx7kf"
Dec 01 10:46:31 crc kubenswrapper[4909]: I1201 10:46:31.644054 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dx7kf"]
Dec 01 10:46:32 crc kubenswrapper[4909]: I1201 10:46:32.001398 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dx7kf" podUID="6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f" containerName="registry-server" containerID="cri-o://f827416a0e0a026dbc9ee8d047757d08d3754a428d68662b0ee20b530c02b8da" gracePeriod=2
Dec 01 10:46:32 crc kubenswrapper[4909]: I1201 10:46:32.401890 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dx7kf"
Dec 01 10:46:32 crc kubenswrapper[4909]: I1201 10:46:32.471126 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f-catalog-content\") pod \"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f\" (UID: \"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f\") "
Dec 01 10:46:32 crc kubenswrapper[4909]: I1201 10:46:32.471208 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f-utilities\") pod \"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f\" (UID: \"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f\") "
Dec 01 10:46:32 crc kubenswrapper[4909]: I1201 10:46:32.471320 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g76fc\" (UniqueName: \"kubernetes.io/projected/6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f-kube-api-access-g76fc\") pod \"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f\" (UID: \"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f\") "
Dec 01 10:46:32 crc kubenswrapper[4909]: I1201 10:46:32.472375 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f-utilities" (OuterVolumeSpecName: "utilities") pod "6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f" (UID: "6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:46:32 crc kubenswrapper[4909]: I1201 10:46:32.482587 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f-kube-api-access-g76fc" (OuterVolumeSpecName: "kube-api-access-g76fc") pod "6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f" (UID: "6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f"). InnerVolumeSpecName "kube-api-access-g76fc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:46:32 crc kubenswrapper[4909]: I1201 10:46:32.573649 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 10:46:32 crc kubenswrapper[4909]: I1201 10:46:32.573686 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g76fc\" (UniqueName: \"kubernetes.io/projected/6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f-kube-api-access-g76fc\") on node \"crc\" DevicePath \"\""
Dec 01 10:46:32 crc kubenswrapper[4909]: I1201 10:46:32.577230 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f" (UID: "6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:46:32 crc kubenswrapper[4909]: I1201 10:46:32.675572 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 10:46:33 crc kubenswrapper[4909]: I1201 10:46:33.014537 4909 generic.go:334] "Generic (PLEG): container finished" podID="6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f" containerID="f827416a0e0a026dbc9ee8d047757d08d3754a428d68662b0ee20b530c02b8da" exitCode=0
Dec 01 10:46:33 crc kubenswrapper[4909]: I1201 10:46:33.014601 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx7kf" event={"ID":"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f","Type":"ContainerDied","Data":"f827416a0e0a026dbc9ee8d047757d08d3754a428d68662b0ee20b530c02b8da"}
Dec 01 10:46:33 crc kubenswrapper[4909]: I1201 10:46:33.014646 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx7kf" event={"ID":"6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f","Type":"ContainerDied","Data":"b22f7d3135aae037baa86b40a5968ae98aceb8829e1f462eb4546e5b91382e52"}
Dec 01 10:46:33 crc kubenswrapper[4909]: I1201 10:46:33.014647 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dx7kf"
Dec 01 10:46:33 crc kubenswrapper[4909]: I1201 10:46:33.014672 4909 scope.go:117] "RemoveContainer" containerID="f827416a0e0a026dbc9ee8d047757d08d3754a428d68662b0ee20b530c02b8da"
Dec 01 10:46:33 crc kubenswrapper[4909]: I1201 10:46:33.068049 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dx7kf"]
Dec 01 10:46:33 crc kubenswrapper[4909]: I1201 10:46:33.071112 4909 scope.go:117] "RemoveContainer" containerID="cc65083fd523498ade84f141e06257a432eab9770e5b2a7813795ab8f6ca6616"
Dec 01 10:46:33 crc kubenswrapper[4909]: I1201 10:46:33.091666 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dx7kf"]
Dec 01 10:46:33 crc kubenswrapper[4909]: I1201 10:46:33.099351 4909 scope.go:117] "RemoveContainer" containerID="0d8ffbe07de06ea63ec37c9dbcc96e15ed3b2a5278c65a1784b82be1c39cc43f"
Dec 01 10:46:33 crc kubenswrapper[4909]: I1201 10:46:33.122131 4909 scope.go:117] "RemoveContainer" containerID="f827416a0e0a026dbc9ee8d047757d08d3754a428d68662b0ee20b530c02b8da"
Dec 01 10:46:33 crc kubenswrapper[4909]: E1201 10:46:33.122622 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f827416a0e0a026dbc9ee8d047757d08d3754a428d68662b0ee20b530c02b8da\": container with ID starting with f827416a0e0a026dbc9ee8d047757d08d3754a428d68662b0ee20b530c02b8da not found: ID does not exist" containerID="f827416a0e0a026dbc9ee8d047757d08d3754a428d68662b0ee20b530c02b8da"
Dec 01 10:46:33 crc kubenswrapper[4909]: I1201 10:46:33.122663 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f827416a0e0a026dbc9ee8d047757d08d3754a428d68662b0ee20b530c02b8da"} err="failed to get container status \"f827416a0e0a026dbc9ee8d047757d08d3754a428d68662b0ee20b530c02b8da\": rpc error: code = NotFound desc = could not find container \"f827416a0e0a026dbc9ee8d047757d08d3754a428d68662b0ee20b530c02b8da\": container with ID starting with f827416a0e0a026dbc9ee8d047757d08d3754a428d68662b0ee20b530c02b8da not found: ID does not exist"
Dec 01 10:46:33 crc kubenswrapper[4909]: I1201 10:46:33.122688 4909 scope.go:117] "RemoveContainer" containerID="cc65083fd523498ade84f141e06257a432eab9770e5b2a7813795ab8f6ca6616"
Dec 01 10:46:33 crc kubenswrapper[4909]: E1201 10:46:33.123072 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc65083fd523498ade84f141e06257a432eab9770e5b2a7813795ab8f6ca6616\": container with ID starting with cc65083fd523498ade84f141e06257a432eab9770e5b2a7813795ab8f6ca6616 not found: ID does not exist" containerID="cc65083fd523498ade84f141e06257a432eab9770e5b2a7813795ab8f6ca6616"
Dec 01 10:46:33 crc kubenswrapper[4909]: I1201 10:46:33.123093 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc65083fd523498ade84f141e06257a432eab9770e5b2a7813795ab8f6ca6616"} err="failed to get container status \"cc65083fd523498ade84f141e06257a432eab9770e5b2a7813795ab8f6ca6616\": rpc error: code = NotFound desc = could not find container \"cc65083fd523498ade84f141e06257a432eab9770e5b2a7813795ab8f6ca6616\": container with ID starting with cc65083fd523498ade84f141e06257a432eab9770e5b2a7813795ab8f6ca6616 not found: ID does not exist"
Dec 01 10:46:33 crc kubenswrapper[4909]: I1201 10:46:33.123109 4909 scope.go:117] "RemoveContainer" containerID="0d8ffbe07de06ea63ec37c9dbcc96e15ed3b2a5278c65a1784b82be1c39cc43f"
Dec 01 10:46:33 crc kubenswrapper[4909]: E1201 10:46:33.123412 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d8ffbe07de06ea63ec37c9dbcc96e15ed3b2a5278c65a1784b82be1c39cc43f\": container with ID starting with 0d8ffbe07de06ea63ec37c9dbcc96e15ed3b2a5278c65a1784b82be1c39cc43f not found: ID does not exist" containerID="0d8ffbe07de06ea63ec37c9dbcc96e15ed3b2a5278c65a1784b82be1c39cc43f"
Dec 01 10:46:33 crc kubenswrapper[4909]: I1201 10:46:33.123432 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d8ffbe07de06ea63ec37c9dbcc96e15ed3b2a5278c65a1784b82be1c39cc43f"} err="failed to get container status \"0d8ffbe07de06ea63ec37c9dbcc96e15ed3b2a5278c65a1784b82be1c39cc43f\": rpc error: code = NotFound desc = could not find container \"0d8ffbe07de06ea63ec37c9dbcc96e15ed3b2a5278c65a1784b82be1c39cc43f\": container with ID starting with 0d8ffbe07de06ea63ec37c9dbcc96e15ed3b2a5278c65a1784b82be1c39cc43f not found: ID does not exist"
Dec 01 10:46:33 crc kubenswrapper[4909]: I1201 10:46:33.269578 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f" path="/var/lib/kubelet/pods/6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f/volumes"
Dec 01 10:46:36 crc kubenswrapper[4909]: I1201 10:46:36.194005 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 10:46:36 crc kubenswrapper[4909]: I1201 10:46:36.195838 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.380974 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-xq8w6"]
Dec 01 10:46:38 crc kubenswrapper[4909]: E1201 10:46:38.381698 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b8ef6df-1ab3-4c5c-a232-730700d94d24" containerName="extract-utilities"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.381714 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b8ef6df-1ab3-4c5c-a232-730700d94d24" containerName="extract-utilities"
Dec 01 10:46:38 crc kubenswrapper[4909]: E1201 10:46:38.381725 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e96450-bac5-45cf-94b2-01ad0d4a267e" containerName="extract-utilities"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.381732 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e96450-bac5-45cf-94b2-01ad0d4a267e" containerName="extract-utilities"
Dec 01 10:46:38 crc kubenswrapper[4909]: E1201 10:46:38.381739 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e96450-bac5-45cf-94b2-01ad0d4a267e" containerName="registry-server"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.381745 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e96450-bac5-45cf-94b2-01ad0d4a267e" containerName="registry-server"
Dec 01 10:46:38 crc kubenswrapper[4909]: E1201 10:46:38.381757 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f" containerName="extract-content"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.381763 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f" containerName="extract-content"
Dec 01 10:46:38 crc kubenswrapper[4909]: E1201 10:46:38.381774 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f" containerName="registry-server"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.381779 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f" containerName="registry-server"
Dec 01 10:46:38 crc kubenswrapper[4909]: E1201 10:46:38.381796 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b8ef6df-1ab3-4c5c-a232-730700d94d24" containerName="registry-server"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.381803 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b8ef6df-1ab3-4c5c-a232-730700d94d24" containerName="registry-server"
Dec 01 10:46:38 crc kubenswrapper[4909]: E1201 10:46:38.381815 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f" containerName="extract-utilities"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.381820 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f" containerName="extract-utilities"
Dec 01 10:46:38 crc kubenswrapper[4909]: E1201 10:46:38.381832 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e96450-bac5-45cf-94b2-01ad0d4a267e" containerName="extract-content"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.381838 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e96450-bac5-45cf-94b2-01ad0d4a267e" containerName="extract-content"
Dec 01 10:46:38 crc kubenswrapper[4909]: E1201 10:46:38.381849 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b8ef6df-1ab3-4c5c-a232-730700d94d24" containerName="extract-content"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.381856 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b8ef6df-1ab3-4c5c-a232-730700d94d24" containerName="extract-content"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.382131 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b8ef6df-1ab3-4c5c-a232-730700d94d24" containerName="registry-server"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.382144 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="60e96450-bac5-45cf-94b2-01ad0d4a267e" containerName="registry-server"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.382156 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f471e9c-d0ad-4d8a-a2d1-be0dd5e31e7f" containerName="registry-server"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.382999 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-xq8w6"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.386936 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-c8t88"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.388100 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.388456 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.389197 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.401999 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-xq8w6"]
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.488574 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b3ea705-85ca-404c-b70e-41d9018d9f7d-config\") pod \"dnsmasq-dns-675f4bcbfc-xq8w6\" (UID: \"9b3ea705-85ca-404c-b70e-41d9018d9f7d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-xq8w6"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.488656 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v292\" (UniqueName: \"kubernetes.io/projected/9b3ea705-85ca-404c-b70e-41d9018d9f7d-kube-api-access-6v292\") pod \"dnsmasq-dns-675f4bcbfc-xq8w6\" (UID: \"9b3ea705-85ca-404c-b70e-41d9018d9f7d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-xq8w6"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.501853 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-g29zq"]
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.503463 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-g29zq"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.505262 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.521167 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-g29zq"]
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.590383 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v292\" (UniqueName: \"kubernetes.io/projected/9b3ea705-85ca-404c-b70e-41d9018d9f7d-kube-api-access-6v292\") pod \"dnsmasq-dns-675f4bcbfc-xq8w6\" (UID: \"9b3ea705-85ca-404c-b70e-41d9018d9f7d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-xq8w6"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.591076 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4507f0c4-3667-4e30-9e41-45e864fc724a-config\") pod \"dnsmasq-dns-78dd6ddcc-g29zq\" (UID: \"4507f0c4-3667-4e30-9e41-45e864fc724a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g29zq"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.591174 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4507f0c4-3667-4e30-9e41-45e864fc724a-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-g29zq\" (UID: \"4507f0c4-3667-4e30-9e41-45e864fc724a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g29zq"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.591195 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t558c\" (UniqueName: \"kubernetes.io/projected/4507f0c4-3667-4e30-9e41-45e864fc724a-kube-api-access-t558c\") pod \"dnsmasq-dns-78dd6ddcc-g29zq\" (UID: \"4507f0c4-3667-4e30-9e41-45e864fc724a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g29zq"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.591237 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b3ea705-85ca-404c-b70e-41d9018d9f7d-config\") pod \"dnsmasq-dns-675f4bcbfc-xq8w6\" (UID: \"9b3ea705-85ca-404c-b70e-41d9018d9f7d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-xq8w6"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.592058 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b3ea705-85ca-404c-b70e-41d9018d9f7d-config\") pod \"dnsmasq-dns-675f4bcbfc-xq8w6\" (UID: \"9b3ea705-85ca-404c-b70e-41d9018d9f7d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-xq8w6"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.609715 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v292\" (UniqueName: \"kubernetes.io/projected/9b3ea705-85ca-404c-b70e-41d9018d9f7d-kube-api-access-6v292\") pod \"dnsmasq-dns-675f4bcbfc-xq8w6\" (UID: \"9b3ea705-85ca-404c-b70e-41d9018d9f7d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-xq8w6"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.692528 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4507f0c4-3667-4e30-9e41-45e864fc724a-config\") pod \"dnsmasq-dns-78dd6ddcc-g29zq\" (UID: \"4507f0c4-3667-4e30-9e41-45e864fc724a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g29zq"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.693475 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4507f0c4-3667-4e30-9e41-45e864fc724a-config\") pod \"dnsmasq-dns-78dd6ddcc-g29zq\" (UID: \"4507f0c4-3667-4e30-9e41-45e864fc724a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g29zq"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.694362 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4507f0c4-3667-4e30-9e41-45e864fc724a-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-g29zq\" (UID: \"4507f0c4-3667-4e30-9e41-45e864fc724a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g29zq"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.694420 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4507f0c4-3667-4e30-9e41-45e864fc724a-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-g29zq\" (UID: \"4507f0c4-3667-4e30-9e41-45e864fc724a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g29zq"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.694469 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t558c\" (UniqueName: \"kubernetes.io/projected/4507f0c4-3667-4e30-9e41-45e864fc724a-kube-api-access-t558c\") pod \"dnsmasq-dns-78dd6ddcc-g29zq\" (UID: \"4507f0c4-3667-4e30-9e41-45e864fc724a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g29zq"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.698841 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-xq8w6"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.712573 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t558c\" (UniqueName: \"kubernetes.io/projected/4507f0c4-3667-4e30-9e41-45e864fc724a-kube-api-access-t558c\") pod \"dnsmasq-dns-78dd6ddcc-g29zq\" (UID: \"4507f0c4-3667-4e30-9e41-45e864fc724a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g29zq"
Dec 01 10:46:38 crc kubenswrapper[4909]: I1201 10:46:38.825448 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-g29zq"
Dec 01 10:46:39 crc kubenswrapper[4909]: I1201 10:46:39.143731 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-xq8w6"]
Dec 01 10:46:39 crc kubenswrapper[4909]: W1201 10:46:39.147898 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b3ea705_85ca_404c_b70e_41d9018d9f7d.slice/crio-204f6537dd6bab84a9b8ddeb850e580a673eaaabd6b1b2625dd0b06b5513147d WatchSource:0}: Error finding container 204f6537dd6bab84a9b8ddeb850e580a673eaaabd6b1b2625dd0b06b5513147d: Status 404 returned error can't find the container with id 204f6537dd6bab84a9b8ddeb850e580a673eaaabd6b1b2625dd0b06b5513147d
Dec 01 10:46:39 crc kubenswrapper[4909]: I1201 10:46:39.151112 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 01 10:46:39 crc kubenswrapper[4909]: I1201 10:46:39.287069 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-g29zq"]
Dec 01 10:46:39 crc kubenswrapper[4909]: W1201 10:46:39.293403 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4507f0c4_3667_4e30_9e41_45e864fc724a.slice/crio-097c7477fef6fa2a47cdcdd8d88533263c138bdc7fd18f16928053e8961ec88a WatchSource:0}: Error finding container 097c7477fef6fa2a47cdcdd8d88533263c138bdc7fd18f16928053e8961ec88a: Status 404 returned error can't find the container with id 097c7477fef6fa2a47cdcdd8d88533263c138bdc7fd18f16928053e8961ec88a
Dec 01 10:46:40 crc kubenswrapper[4909]: I1201 10:46:40.077098 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-g29zq" event={"ID":"4507f0c4-3667-4e30-9e41-45e864fc724a","Type":"ContainerStarted","Data":"097c7477fef6fa2a47cdcdd8d88533263c138bdc7fd18f16928053e8961ec88a"}
Dec 01 10:46:40 crc kubenswrapper[4909]: I1201 10:46:40.078980 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-xq8w6" event={"ID":"9b3ea705-85ca-404c-b70e-41d9018d9f7d","Type":"ContainerStarted","Data":"204f6537dd6bab84a9b8ddeb850e580a673eaaabd6b1b2625dd0b06b5513147d"}
Dec 01 10:46:41 crc kubenswrapper[4909]: I1201 10:46:41.540152 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-xq8w6"]
Dec 01 10:46:41 crc kubenswrapper[4909]: I1201 10:46:41.615508 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-d9qt5"]
Dec 01 10:46:41 crc kubenswrapper[4909]: I1201 10:46:41.617020 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-d9qt5"
Dec 01 10:46:41 crc kubenswrapper[4909]: I1201 10:46:41.644182 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-d9qt5"]
Dec 01 10:46:41 crc kubenswrapper[4909]: I1201 10:46:41.762370 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l44zd\" (UniqueName: \"kubernetes.io/projected/7ef13416-f63b-4949-bf3e-83e9b95c58fc-kube-api-access-l44zd\") pod \"dnsmasq-dns-666b6646f7-d9qt5\" (UID: \"7ef13416-f63b-4949-bf3e-83e9b95c58fc\") " pod="openstack/dnsmasq-dns-666b6646f7-d9qt5"
Dec 01 10:46:41 crc kubenswrapper[4909]: I1201 10:46:41.762471 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ef13416-f63b-4949-bf3e-83e9b95c58fc-config\") pod \"dnsmasq-dns-666b6646f7-d9qt5\" (UID: \"7ef13416-f63b-4949-bf3e-83e9b95c58fc\") " pod="openstack/dnsmasq-dns-666b6646f7-d9qt5"
Dec 01 10:46:41 crc kubenswrapper[4909]: I1201 10:46:41.762524 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ef13416-f63b-4949-bf3e-83e9b95c58fc-dns-svc\") pod \"dnsmasq-dns-666b6646f7-d9qt5\" (UID: \"7ef13416-f63b-4949-bf3e-83e9b95c58fc\") " pod="openstack/dnsmasq-dns-666b6646f7-d9qt5"
Dec 01 10:46:41 crc kubenswrapper[4909]: I1201 10:46:41.865442 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ef13416-f63b-4949-bf3e-83e9b95c58fc-config\") pod \"dnsmasq-dns-666b6646f7-d9qt5\" (UID: \"7ef13416-f63b-4949-bf3e-83e9b95c58fc\") " pod="openstack/dnsmasq-dns-666b6646f7-d9qt5"
Dec 01 10:46:41 crc kubenswrapper[4909]: I1201 10:46:41.865530 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ef13416-f63b-4949-bf3e-83e9b95c58fc-dns-svc\") pod \"dnsmasq-dns-666b6646f7-d9qt5\" (UID: \"7ef13416-f63b-4949-bf3e-83e9b95c58fc\") " pod="openstack/dnsmasq-dns-666b6646f7-d9qt5"
Dec 01 10:46:41 crc kubenswrapper[4909]: I1201 10:46:41.865616 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l44zd\" (UniqueName: \"kubernetes.io/projected/7ef13416-f63b-4949-bf3e-83e9b95c58fc-kube-api-access-l44zd\") pod \"dnsmasq-dns-666b6646f7-d9qt5\" (UID: \"7ef13416-f63b-4949-bf3e-83e9b95c58fc\") " pod="openstack/dnsmasq-dns-666b6646f7-d9qt5"
Dec 01 10:46:41 crc kubenswrapper[4909]: I1201 10:46:41.866959 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ef13416-f63b-4949-bf3e-83e9b95c58fc-config\") pod \"dnsmasq-dns-666b6646f7-d9qt5\" (UID: \"7ef13416-f63b-4949-bf3e-83e9b95c58fc\") " pod="openstack/dnsmasq-dns-666b6646f7-d9qt5"
Dec 01 10:46:41 crc kubenswrapper[4909]: I1201 10:46:41.867490 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ef13416-f63b-4949-bf3e-83e9b95c58fc-dns-svc\") pod \"dnsmasq-dns-666b6646f7-d9qt5\" (UID: \"7ef13416-f63b-4949-bf3e-83e9b95c58fc\") " pod="openstack/dnsmasq-dns-666b6646f7-d9qt5"
Dec 01 10:46:41 crc kubenswrapper[4909]: I1201 10:46:41.893327 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l44zd\" (UniqueName: \"kubernetes.io/projected/7ef13416-f63b-4949-bf3e-83e9b95c58fc-kube-api-access-l44zd\") pod \"dnsmasq-dns-666b6646f7-d9qt5\" (UID: \"7ef13416-f63b-4949-bf3e-83e9b95c58fc\") " pod="openstack/dnsmasq-dns-666b6646f7-d9qt5"
Dec 01 10:46:41 crc kubenswrapper[4909]: I1201 10:46:41.937822 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-g29zq"]
Dec 01 10:46:41 crc kubenswrapper[4909]: I1201 10:46:41.947341 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-d9qt5"
Dec 01 10:46:41 crc kubenswrapper[4909]: I1201 10:46:41.963386 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-sl7td"]
Dec 01 10:46:41 crc kubenswrapper[4909]: I1201 10:46:41.977110 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-sl7td"
Dec 01 10:46:41 crc kubenswrapper[4909]: I1201 10:46:41.985621 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-sl7td"]
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.072420 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2ecb921-0356-4a1d-93dd-1fb91e41e081-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-sl7td\" (UID: \"e2ecb921-0356-4a1d-93dd-1fb91e41e081\") " pod="openstack/dnsmasq-dns-57d769cc4f-sl7td"
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.072959 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2ecb921-0356-4a1d-93dd-1fb91e41e081-config\") pod \"dnsmasq-dns-57d769cc4f-sl7td\" (UID: \"e2ecb921-0356-4a1d-93dd-1fb91e41e081\") " pod="openstack/dnsmasq-dns-57d769cc4f-sl7td"
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.073020 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mxt2\" (UniqueName: \"kubernetes.io/projected/e2ecb921-0356-4a1d-93dd-1fb91e41e081-kube-api-access-9mxt2\") pod \"dnsmasq-dns-57d769cc4f-sl7td\" (UID: \"e2ecb921-0356-4a1d-93dd-1fb91e41e081\") " pod="openstack/dnsmasq-dns-57d769cc4f-sl7td"
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.182249 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2ecb921-0356-4a1d-93dd-1fb91e41e081-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-sl7td\" (UID: \"e2ecb921-0356-4a1d-93dd-1fb91e41e081\") " pod="openstack/dnsmasq-dns-57d769cc4f-sl7td"
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.182310 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2ecb921-0356-4a1d-93dd-1fb91e41e081-config\") pod \"dnsmasq-dns-57d769cc4f-sl7td\" (UID: \"e2ecb921-0356-4a1d-93dd-1fb91e41e081\") " pod="openstack/dnsmasq-dns-57d769cc4f-sl7td"
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.182347 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mxt2\" (UniqueName: \"kubernetes.io/projected/e2ecb921-0356-4a1d-93dd-1fb91e41e081-kube-api-access-9mxt2\") pod \"dnsmasq-dns-57d769cc4f-sl7td\" (UID: \"e2ecb921-0356-4a1d-93dd-1fb91e41e081\") " pod="openstack/dnsmasq-dns-57d769cc4f-sl7td"
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.183485 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2ecb921-0356-4a1d-93dd-1fb91e41e081-config\") pod \"dnsmasq-dns-57d769cc4f-sl7td\" (UID: \"e2ecb921-0356-4a1d-93dd-1fb91e41e081\") " pod="openstack/dnsmasq-dns-57d769cc4f-sl7td"
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.183532 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2ecb921-0356-4a1d-93dd-1fb91e41e081-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-sl7td\" (UID: \"e2ecb921-0356-4a1d-93dd-1fb91e41e081\") " pod="openstack/dnsmasq-dns-57d769cc4f-sl7td"
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.224467 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mxt2\" (UniqueName: \"kubernetes.io/projected/e2ecb921-0356-4a1d-93dd-1fb91e41e081-kube-api-access-9mxt2\") pod \"dnsmasq-dns-57d769cc4f-sl7td\" (UID: \"e2ecb921-0356-4a1d-93dd-1fb91e41e081\") " pod="openstack/dnsmasq-dns-57d769cc4f-sl7td"
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.332827 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-sl7td"
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.536006 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-d9qt5"]
Dec 01 10:46:42 crc kubenswrapper[4909]: W1201 10:46:42.553106 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ef13416_f63b_4949_bf3e_83e9b95c58fc.slice/crio-a2e8b5c0d01293e501850130a9c07c23e394a871d12e19df1053bfbf30a1e7ca WatchSource:0}: Error finding container a2e8b5c0d01293e501850130a9c07c23e394a871d12e19df1053bfbf30a1e7ca: Status 404 returned error can't find the container with id a2e8b5c0d01293e501850130a9c07c23e394a871d12e19df1053bfbf30a1e7ca
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.785092 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.787288 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.796046 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.796331 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-cvcbl"
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.796514 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.796515 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.796659 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.796796 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.796851 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.797125 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.877998 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-sl7td"]
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.897796 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fa1d0c2b-1efc-451b-9fe5-58debd89810e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0"
Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.897979 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdwwb\" (UniqueName: \"kubernetes.io/projected/fa1d0c2b-1efc-451b-9fe5-58debd89810e-kube-api-access-kdwwb\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.898039 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa1d0c2b-1efc-451b-9fe5-58debd89810e-config-data\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.898122 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fa1d0c2b-1efc-451b-9fe5-58debd89810e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.898158 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.898194 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.898322 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.898352 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.898393 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fa1d0c2b-1efc-451b-9fe5-58debd89810e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.898525 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:42 crc kubenswrapper[4909]: I1201 10:46:42.898574 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fa1d0c2b-1efc-451b-9fe5-58debd89810e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.000782 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.000867 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fa1d0c2b-1efc-451b-9fe5-58debd89810e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.000917 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.000958 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fa1d0c2b-1efc-451b-9fe5-58debd89810e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.001005 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fa1d0c2b-1efc-451b-9fe5-58debd89810e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.001045 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdwwb\" (UniqueName: \"kubernetes.io/projected/fa1d0c2b-1efc-451b-9fe5-58debd89810e-kube-api-access-kdwwb\") pod \"rabbitmq-server-0\" (UID: 
\"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.001068 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa1d0c2b-1efc-451b-9fe5-58debd89810e-config-data\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.001097 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fa1d0c2b-1efc-451b-9fe5-58debd89810e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.001131 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.001174 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.001212 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.001662 4909 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.001921 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.002175 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fa1d0c2b-1efc-451b-9fe5-58debd89810e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.002197 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa1d0c2b-1efc-451b-9fe5-58debd89810e-config-data\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.002240 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.004037 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fa1d0c2b-1efc-451b-9fe5-58debd89810e-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.010368 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fa1d0c2b-1efc-451b-9fe5-58debd89810e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.010560 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.024100 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.025726 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fa1d0c2b-1efc-451b-9fe5-58debd89810e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.026013 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdwwb\" (UniqueName: \"kubernetes.io/projected/fa1d0c2b-1efc-451b-9fe5-58debd89810e-kube-api-access-kdwwb\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.034207 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.129692 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.134529 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-sl7td" event={"ID":"e2ecb921-0356-4a1d-93dd-1fb91e41e081","Type":"ContainerStarted","Data":"e254f8739b11037302726b36760e5c1d60fd38cf830e73eeda201d079043cee1"} Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.136265 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-d9qt5" event={"ID":"7ef13416-f63b-4949-bf3e-83e9b95c58fc","Type":"ContainerStarted","Data":"a2e8b5c0d01293e501850130a9c07c23e394a871d12e19df1053bfbf30a1e7ca"} Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.173933 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.175245 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.181625 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.181895 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-pcgf7" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.181966 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.182043 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.182063 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.182065 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.182287 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.194543 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.305238 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/226ba07f-6dee-4f12-9d0e-4ae327457c2e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.305431 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/226ba07f-6dee-4f12-9d0e-4ae327457c2e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.305468 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.305492 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/226ba07f-6dee-4f12-9d0e-4ae327457c2e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.305516 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.305541 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/226ba07f-6dee-4f12-9d0e-4ae327457c2e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.305573 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.305595 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbl7d\" (UniqueName: \"kubernetes.io/projected/226ba07f-6dee-4f12-9d0e-4ae327457c2e-kube-api-access-zbl7d\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.305617 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.305637 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.305680 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/226ba07f-6dee-4f12-9d0e-4ae327457c2e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.407598 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/226ba07f-6dee-4f12-9d0e-4ae327457c2e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.407676 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/226ba07f-6dee-4f12-9d0e-4ae327457c2e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.407707 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.407729 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/226ba07f-6dee-4f12-9d0e-4ae327457c2e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.407749 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.407768 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/226ba07f-6dee-4f12-9d0e-4ae327457c2e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.407810 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.407850 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbl7d\" (UniqueName: \"kubernetes.io/projected/226ba07f-6dee-4f12-9d0e-4ae327457c2e-kube-api-access-zbl7d\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.409293 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/226ba07f-6dee-4f12-9d0e-4ae327457c2e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.409296 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.409361 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc 
kubenswrapper[4909]: I1201 10:46:43.409442 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/226ba07f-6dee-4f12-9d0e-4ae327457c2e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.409810 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.411641 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/226ba07f-6dee-4f12-9d0e-4ae327457c2e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.414894 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/226ba07f-6dee-4f12-9d0e-4ae327457c2e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.415024 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.417558 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/226ba07f-6dee-4f12-9d0e-4ae327457c2e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.417835 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.421684 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/226ba07f-6dee-4f12-9d0e-4ae327457c2e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.422647 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.422005 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.429976 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbl7d\" (UniqueName: 
\"kubernetes.io/projected/226ba07f-6dee-4f12-9d0e-4ae327457c2e-kube-api-access-zbl7d\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.461165 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:43 crc kubenswrapper[4909]: I1201 10:46:43.518104 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.573465 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.576379 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.578673 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.580275 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-wmxzn" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.581283 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.581550 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.586220 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.597381 4909 reflector.go:368] Caches populated 
for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.633916 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b522c19-139d-41b2-ad31-94e12157a398-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.634004 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.634045 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5b522c19-139d-41b2-ad31-94e12157a398-kolla-config\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.634086 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5b522c19-139d-41b2-ad31-94e12157a398-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.634104 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-844rx\" (UniqueName: \"kubernetes.io/projected/5b522c19-139d-41b2-ad31-94e12157a398-kube-api-access-844rx\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " 
pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.634131 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b522c19-139d-41b2-ad31-94e12157a398-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.634179 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b522c19-139d-41b2-ad31-94e12157a398-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.634205 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5b522c19-139d-41b2-ad31-94e12157a398-config-data-default\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.736355 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b522c19-139d-41b2-ad31-94e12157a398-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.736445 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.736913 
4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5b522c19-139d-41b2-ad31-94e12157a398-kolla-config\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.736969 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5b522c19-139d-41b2-ad31-94e12157a398-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.736990 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-844rx\" (UniqueName: \"kubernetes.io/projected/5b522c19-139d-41b2-ad31-94e12157a398-kube-api-access-844rx\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.737021 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b522c19-139d-41b2-ad31-94e12157a398-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.737061 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b522c19-139d-41b2-ad31-94e12157a398-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.737088 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/5b522c19-139d-41b2-ad31-94e12157a398-config-data-default\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.737517 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5b522c19-139d-41b2-ad31-94e12157a398-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.740061 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.740080 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5b522c19-139d-41b2-ad31-94e12157a398-config-data-default\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.740907 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5b522c19-139d-41b2-ad31-94e12157a398-kolla-config\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.740931 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b522c19-139d-41b2-ad31-94e12157a398-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.745898 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b522c19-139d-41b2-ad31-94e12157a398-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.758773 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b522c19-139d-41b2-ad31-94e12157a398-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.759948 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-844rx\" (UniqueName: \"kubernetes.io/projected/5b522c19-139d-41b2-ad31-94e12157a398-kube-api-access-844rx\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.776009 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"5b522c19-139d-41b2-ad31-94e12157a398\") " pod="openstack/openstack-galera-0" Dec 01 10:46:44 crc kubenswrapper[4909]: I1201 10:46:44.914626 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 01 10:46:45 crc kubenswrapper[4909]: I1201 10:46:45.967961 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 10:46:45 crc kubenswrapper[4909]: I1201 10:46:45.974768 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:45 crc kubenswrapper[4909]: I1201 10:46:45.977884 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 01 10:46:45 crc kubenswrapper[4909]: I1201 10:46:45.982817 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 01 10:46:45 crc kubenswrapper[4909]: I1201 10:46:45.984520 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 01 10:46:45 crc kubenswrapper[4909]: I1201 10:46:45.984849 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-g8vn5" Dec 01 10:46:45 crc kubenswrapper[4909]: I1201 10:46:45.994426 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.070259 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmjkh\" (UniqueName: \"kubernetes.io/projected/b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9-kube-api-access-jmjkh\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.070428 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.070527 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.070618 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.070683 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.070719 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.070867 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.070921 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.160358 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.161982 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.168214 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.168324 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-grbll" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.168414 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.172424 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.172459 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.172508 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/d516e7e6-4b24-41b8-bde1-d533d1e77d61-config-data\") pod \"memcached-0\" (UID: \"d516e7e6-4b24-41b8-bde1-d533d1e77d61\") " pod="openstack/memcached-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.172537 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmjkh\" (UniqueName: \"kubernetes.io/projected/b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9-kube-api-access-jmjkh\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.172556 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d516e7e6-4b24-41b8-bde1-d533d1e77d61-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d516e7e6-4b24-41b8-bde1-d533d1e77d61\") " pod="openstack/memcached-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.172576 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.172605 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.172632 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b7p4\" (UniqueName: 
\"kubernetes.io/projected/d516e7e6-4b24-41b8-bde1-d533d1e77d61-kube-api-access-8b7p4\") pod \"memcached-0\" (UID: \"d516e7e6-4b24-41b8-bde1-d533d1e77d61\") " pod="openstack/memcached-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.172661 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.172687 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.172710 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.172746 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d516e7e6-4b24-41b8-bde1-d533d1e77d61-kolla-config\") pod \"memcached-0\" (UID: \"d516e7e6-4b24-41b8-bde1-d533d1e77d61\") " pod="openstack/memcached-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.172789 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d516e7e6-4b24-41b8-bde1-d533d1e77d61-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"d516e7e6-4b24-41b8-bde1-d533d1e77d61\") " pod="openstack/memcached-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.173240 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.178234 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.178521 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.180813 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.181553 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 
crc kubenswrapper[4909]: I1201 10:46:46.185217 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.195898 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.204775 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.218131 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.234583 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmjkh\" (UniqueName: \"kubernetes.io/projected/b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9-kube-api-access-jmjkh\") pod \"openstack-cell1-galera-0\" (UID: \"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.273945 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b7p4\" (UniqueName: \"kubernetes.io/projected/d516e7e6-4b24-41b8-bde1-d533d1e77d61-kube-api-access-8b7p4\") pod \"memcached-0\" (UID: \"d516e7e6-4b24-41b8-bde1-d533d1e77d61\") " pod="openstack/memcached-0" Dec 01 
10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.274040 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d516e7e6-4b24-41b8-bde1-d533d1e77d61-kolla-config\") pod \"memcached-0\" (UID: \"d516e7e6-4b24-41b8-bde1-d533d1e77d61\") " pod="openstack/memcached-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.274103 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d516e7e6-4b24-41b8-bde1-d533d1e77d61-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d516e7e6-4b24-41b8-bde1-d533d1e77d61\") " pod="openstack/memcached-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.274162 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d516e7e6-4b24-41b8-bde1-d533d1e77d61-config-data\") pod \"memcached-0\" (UID: \"d516e7e6-4b24-41b8-bde1-d533d1e77d61\") " pod="openstack/memcached-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.274205 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d516e7e6-4b24-41b8-bde1-d533d1e77d61-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d516e7e6-4b24-41b8-bde1-d533d1e77d61\") " pod="openstack/memcached-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.281062 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d516e7e6-4b24-41b8-bde1-d533d1e77d61-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d516e7e6-4b24-41b8-bde1-d533d1e77d61\") " pod="openstack/memcached-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.282633 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/d516e7e6-4b24-41b8-bde1-d533d1e77d61-kolla-config\") pod \"memcached-0\" (UID: \"d516e7e6-4b24-41b8-bde1-d533d1e77d61\") " pod="openstack/memcached-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.283172 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d516e7e6-4b24-41b8-bde1-d533d1e77d61-config-data\") pod \"memcached-0\" (UID: \"d516e7e6-4b24-41b8-bde1-d533d1e77d61\") " pod="openstack/memcached-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.297302 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.308404 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d516e7e6-4b24-41b8-bde1-d533d1e77d61-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d516e7e6-4b24-41b8-bde1-d533d1e77d61\") " pod="openstack/memcached-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.322798 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b7p4\" (UniqueName: \"kubernetes.io/projected/d516e7e6-4b24-41b8-bde1-d533d1e77d61-kube-api-access-8b7p4\") pod \"memcached-0\" (UID: \"d516e7e6-4b24-41b8-bde1-d533d1e77d61\") " pod="openstack/memcached-0" Dec 01 10:46:46 crc kubenswrapper[4909]: I1201 10:46:46.484425 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 01 10:46:48 crc kubenswrapper[4909]: I1201 10:46:48.053855 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 10:46:48 crc kubenswrapper[4909]: I1201 10:46:48.055417 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 10:46:48 crc kubenswrapper[4909]: I1201 10:46:48.062235 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-fstqv" Dec 01 10:46:48 crc kubenswrapper[4909]: I1201 10:46:48.078575 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 10:46:48 crc kubenswrapper[4909]: I1201 10:46:48.107070 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46dht\" (UniqueName: \"kubernetes.io/projected/1b2b1919-713a-48ec-9bed-34d7d2c8bfc8-kube-api-access-46dht\") pod \"kube-state-metrics-0\" (UID: \"1b2b1919-713a-48ec-9bed-34d7d2c8bfc8\") " pod="openstack/kube-state-metrics-0" Dec 01 10:46:48 crc kubenswrapper[4909]: I1201 10:46:48.211128 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46dht\" (UniqueName: \"kubernetes.io/projected/1b2b1919-713a-48ec-9bed-34d7d2c8bfc8-kube-api-access-46dht\") pod \"kube-state-metrics-0\" (UID: \"1b2b1919-713a-48ec-9bed-34d7d2c8bfc8\") " pod="openstack/kube-state-metrics-0" Dec 01 10:46:48 crc kubenswrapper[4909]: I1201 10:46:48.231992 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46dht\" (UniqueName: \"kubernetes.io/projected/1b2b1919-713a-48ec-9bed-34d7d2c8bfc8-kube-api-access-46dht\") pod \"kube-state-metrics-0\" (UID: \"1b2b1919-713a-48ec-9bed-34d7d2c8bfc8\") " pod="openstack/kube-state-metrics-0" Dec 01 10:46:48 crc kubenswrapper[4909]: I1201 10:46:48.378212 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.317309 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zcgpf"] Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.319063 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.322833 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-c6mpr" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.322949 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.323131 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.328795 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zcgpf"] Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.336692 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-z64hl"] Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.339987 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.348589 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-z64hl"] Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.378359 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a2ab5fcc-f33d-4495-a7e4-c4305f3e846e-var-run\") pod \"ovn-controller-zcgpf\" (UID: \"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e\") " pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.378490 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2ab5fcc-f33d-4495-a7e4-c4305f3e846e-combined-ca-bundle\") pod \"ovn-controller-zcgpf\" (UID: \"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e\") " pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.378541 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2ab5fcc-f33d-4495-a7e4-c4305f3e846e-scripts\") pod \"ovn-controller-zcgpf\" (UID: \"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e\") " pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.378566 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2ab5fcc-f33d-4495-a7e4-c4305f3e846e-var-run-ovn\") pod \"ovn-controller-zcgpf\" (UID: \"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e\") " pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.378595 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a2ab5fcc-f33d-4495-a7e4-c4305f3e846e-ovn-controller-tls-certs\") pod \"ovn-controller-zcgpf\" (UID: \"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e\") " pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.378629 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a2ab5fcc-f33d-4495-a7e4-c4305f3e846e-var-log-ovn\") pod \"ovn-controller-zcgpf\" (UID: \"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e\") " pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.378671 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2hwd\" (UniqueName: \"kubernetes.io/projected/a2ab5fcc-f33d-4495-a7e4-c4305f3e846e-kube-api-access-k2hwd\") pod \"ovn-controller-zcgpf\" (UID: \"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e\") " pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.479966 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2ab5fcc-f33d-4495-a7e4-c4305f3e846e-combined-ca-bundle\") pod \"ovn-controller-zcgpf\" (UID: \"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e\") " pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.480450 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cfbdb448-fb4d-48cd-8fa4-b9309172c7f4-var-log\") pod \"ovn-controller-ovs-z64hl\" (UID: \"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4\") " pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.480487 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvdrf\" (UniqueName: 
\"kubernetes.io/projected/cfbdb448-fb4d-48cd-8fa4-b9309172c7f4-kube-api-access-nvdrf\") pod \"ovn-controller-ovs-z64hl\" (UID: \"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4\") " pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.480517 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cfbdb448-fb4d-48cd-8fa4-b9309172c7f4-etc-ovs\") pod \"ovn-controller-ovs-z64hl\" (UID: \"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4\") " pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.480547 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2ab5fcc-f33d-4495-a7e4-c4305f3e846e-scripts\") pod \"ovn-controller-zcgpf\" (UID: \"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e\") " pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.480571 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2ab5fcc-f33d-4495-a7e4-c4305f3e846e-var-run-ovn\") pod \"ovn-controller-zcgpf\" (UID: \"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e\") " pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.480624 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2ab5fcc-f33d-4495-a7e4-c4305f3e846e-ovn-controller-tls-certs\") pod \"ovn-controller-zcgpf\" (UID: \"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e\") " pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.480673 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a2ab5fcc-f33d-4495-a7e4-c4305f3e846e-var-log-ovn\") pod \"ovn-controller-zcgpf\" (UID: 
\"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e\") " pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.480725 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2hwd\" (UniqueName: \"kubernetes.io/projected/a2ab5fcc-f33d-4495-a7e4-c4305f3e846e-kube-api-access-k2hwd\") pod \"ovn-controller-zcgpf\" (UID: \"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e\") " pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.480781 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfbdb448-fb4d-48cd-8fa4-b9309172c7f4-scripts\") pod \"ovn-controller-ovs-z64hl\" (UID: \"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4\") " pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.480807 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a2ab5fcc-f33d-4495-a7e4-c4305f3e846e-var-run\") pod \"ovn-controller-zcgpf\" (UID: \"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e\") " pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.480900 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cfbdb448-fb4d-48cd-8fa4-b9309172c7f4-var-run\") pod \"ovn-controller-ovs-z64hl\" (UID: \"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4\") " pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.480976 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cfbdb448-fb4d-48cd-8fa4-b9309172c7f4-var-lib\") pod \"ovn-controller-ovs-z64hl\" (UID: \"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4\") " pod="openstack/ovn-controller-ovs-z64hl" Dec 01 
10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.481327 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2ab5fcc-f33d-4495-a7e4-c4305f3e846e-var-run-ovn\") pod \"ovn-controller-zcgpf\" (UID: \"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e\") " pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.481436 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a2ab5fcc-f33d-4495-a7e4-c4305f3e846e-var-log-ovn\") pod \"ovn-controller-zcgpf\" (UID: \"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e\") " pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.481624 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a2ab5fcc-f33d-4495-a7e4-c4305f3e846e-var-run\") pod \"ovn-controller-zcgpf\" (UID: \"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e\") " pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.483389 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2ab5fcc-f33d-4495-a7e4-c4305f3e846e-scripts\") pod \"ovn-controller-zcgpf\" (UID: \"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e\") " pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.487394 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2ab5fcc-f33d-4495-a7e4-c4305f3e846e-combined-ca-bundle\") pod \"ovn-controller-zcgpf\" (UID: \"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e\") " pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.489203 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a2ab5fcc-f33d-4495-a7e4-c4305f3e846e-ovn-controller-tls-certs\") pod \"ovn-controller-zcgpf\" (UID: \"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e\") " pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.501675 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2hwd\" (UniqueName: \"kubernetes.io/projected/a2ab5fcc-f33d-4495-a7e4-c4305f3e846e-kube-api-access-k2hwd\") pod \"ovn-controller-zcgpf\" (UID: \"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e\") " pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.582499 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cfbdb448-fb4d-48cd-8fa4-b9309172c7f4-var-log\") pod \"ovn-controller-ovs-z64hl\" (UID: \"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4\") " pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.582560 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvdrf\" (UniqueName: \"kubernetes.io/projected/cfbdb448-fb4d-48cd-8fa4-b9309172c7f4-kube-api-access-nvdrf\") pod \"ovn-controller-ovs-z64hl\" (UID: \"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4\") " pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.582599 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cfbdb448-fb4d-48cd-8fa4-b9309172c7f4-etc-ovs\") pod \"ovn-controller-ovs-z64hl\" (UID: \"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4\") " pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.582684 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfbdb448-fb4d-48cd-8fa4-b9309172c7f4-scripts\") pod \"ovn-controller-ovs-z64hl\" (UID: 
\"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4\") " pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.582715 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cfbdb448-fb4d-48cd-8fa4-b9309172c7f4-var-run\") pod \"ovn-controller-ovs-z64hl\" (UID: \"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4\") " pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.582747 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cfbdb448-fb4d-48cd-8fa4-b9309172c7f4-var-lib\") pod \"ovn-controller-ovs-z64hl\" (UID: \"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4\") " pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.582969 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cfbdb448-fb4d-48cd-8fa4-b9309172c7f4-var-log\") pod \"ovn-controller-ovs-z64hl\" (UID: \"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4\") " pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.583012 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cfbdb448-fb4d-48cd-8fa4-b9309172c7f4-var-lib\") pod \"ovn-controller-ovs-z64hl\" (UID: \"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4\") " pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.583074 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cfbdb448-fb4d-48cd-8fa4-b9309172c7f4-var-run\") pod \"ovn-controller-ovs-z64hl\" (UID: \"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4\") " pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.583221 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cfbdb448-fb4d-48cd-8fa4-b9309172c7f4-etc-ovs\") pod \"ovn-controller-ovs-z64hl\" (UID: \"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4\") " pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.585627 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfbdb448-fb4d-48cd-8fa4-b9309172c7f4-scripts\") pod \"ovn-controller-ovs-z64hl\" (UID: \"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4\") " pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.627688 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvdrf\" (UniqueName: \"kubernetes.io/projected/cfbdb448-fb4d-48cd-8fa4-b9309172c7f4-kube-api-access-nvdrf\") pod \"ovn-controller-ovs-z64hl\" (UID: \"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4\") " pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.641320 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zcgpf" Dec 01 10:46:51 crc kubenswrapper[4909]: I1201 10:46:51.673227 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:46:52 crc kubenswrapper[4909]: I1201 10:46:52.003983 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 10:46:52 crc kubenswrapper[4909]: I1201 10:46:52.986297 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 10:46:52 crc kubenswrapper[4909]: I1201 10:46:52.987677 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:52 crc kubenswrapper[4909]: I1201 10:46:52.993536 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-tkmdr" Dec 01 10:46:52 crc kubenswrapper[4909]: I1201 10:46:52.993767 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 01 10:46:52 crc kubenswrapper[4909]: I1201 10:46:52.993903 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 01 10:46:52 crc kubenswrapper[4909]: I1201 10:46:52.994112 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 01 10:46:52 crc kubenswrapper[4909]: I1201 10:46:52.994261 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.015833 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.120122 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcc01640-a3cc-49fe-b49a-ef344e34d793-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.120464 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcc01640-a3cc-49fe-b49a-ef344e34d793-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.120492 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.120518 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltrpq\" (UniqueName: \"kubernetes.io/projected/dcc01640-a3cc-49fe-b49a-ef344e34d793-kube-api-access-ltrpq\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.120544 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcc01640-a3cc-49fe-b49a-ef344e34d793-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.120608 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dcc01640-a3cc-49fe-b49a-ef344e34d793-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.120628 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcc01640-a3cc-49fe-b49a-ef344e34d793-config\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.120672 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dcc01640-a3cc-49fe-b49a-ef344e34d793-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.222584 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.222670 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltrpq\" (UniqueName: \"kubernetes.io/projected/dcc01640-a3cc-49fe-b49a-ef344e34d793-kube-api-access-ltrpq\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.222737 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcc01640-a3cc-49fe-b49a-ef344e34d793-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.222850 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dcc01640-a3cc-49fe-b49a-ef344e34d793-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.222913 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcc01640-a3cc-49fe-b49a-ef344e34d793-config\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" 
Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.223007 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcc01640-a3cc-49fe-b49a-ef344e34d793-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.223057 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.223091 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcc01640-a3cc-49fe-b49a-ef344e34d793-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.223149 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcc01640-a3cc-49fe-b49a-ef344e34d793-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.223636 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dcc01640-a3cc-49fe-b49a-ef344e34d793-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.224590 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dcc01640-a3cc-49fe-b49a-ef344e34d793-config\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.225346 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcc01640-a3cc-49fe-b49a-ef344e34d793-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.235070 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcc01640-a3cc-49fe-b49a-ef344e34d793-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.243094 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcc01640-a3cc-49fe-b49a-ef344e34d793-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.262124 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcc01640-a3cc-49fe-b49a-ef344e34d793-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.269124 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltrpq\" (UniqueName: \"kubernetes.io/projected/dcc01640-a3cc-49fe-b49a-ef344e34d793-kube-api-access-ltrpq\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " 
pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.304384 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dcc01640-a3cc-49fe-b49a-ef344e34d793\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:53 crc kubenswrapper[4909]: I1201 10:46:53.361316 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.418164 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.419793 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.425184 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.425511 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-krnpz" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.425704 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.425822 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.440250 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.483061 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c44b2fd1-af1d-4e0a-8316-eedf732df3ce-ovsdb-rundir\") pod 
\"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.483232 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c44b2fd1-af1d-4e0a-8316-eedf732df3ce-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.483272 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44b2fd1-af1d-4e0a-8316-eedf732df3ce-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.483317 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c44b2fd1-af1d-4e0a-8316-eedf732df3ce-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.483526 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c44b2fd1-af1d-4e0a-8316-eedf732df3ce-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.483570 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c44b2fd1-af1d-4e0a-8316-eedf732df3ce-config\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " 
pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.483719 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwm4m\" (UniqueName: \"kubernetes.io/projected/c44b2fd1-af1d-4e0a-8316-eedf732df3ce-kube-api-access-mwm4m\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.483768 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.585300 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwm4m\" (UniqueName: \"kubernetes.io/projected/c44b2fd1-af1d-4e0a-8316-eedf732df3ce-kube-api-access-mwm4m\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.585393 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.585451 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c44b2fd1-af1d-4e0a-8316-eedf732df3ce-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.585510 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c44b2fd1-af1d-4e0a-8316-eedf732df3ce-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.585537 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44b2fd1-af1d-4e0a-8316-eedf732df3ce-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.585650 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c44b2fd1-af1d-4e0a-8316-eedf732df3ce-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.585704 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c44b2fd1-af1d-4e0a-8316-eedf732df3ce-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.585728 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c44b2fd1-af1d-4e0a-8316-eedf732df3ce-config\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.585825 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.588132 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c44b2fd1-af1d-4e0a-8316-eedf732df3ce-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.589432 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c44b2fd1-af1d-4e0a-8316-eedf732df3ce-config\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.591132 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c44b2fd1-af1d-4e0a-8316-eedf732df3ce-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.594927 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44b2fd1-af1d-4e0a-8316-eedf732df3ce-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.597487 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c44b2fd1-af1d-4e0a-8316-eedf732df3ce-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.597581 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c44b2fd1-af1d-4e0a-8316-eedf732df3ce-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.606110 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwm4m\" (UniqueName: \"kubernetes.io/projected/c44b2fd1-af1d-4e0a-8316-eedf732df3ce-kube-api-access-mwm4m\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.620617 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c44b2fd1-af1d-4e0a-8316-eedf732df3ce\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:46:55 crc kubenswrapper[4909]: I1201 10:46:55.805434 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 01 10:47:01 crc kubenswrapper[4909]: I1201 10:47:01.302123 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fa1d0c2b-1efc-451b-9fe5-58debd89810e","Type":"ContainerStarted","Data":"70fdfaffb7938545bb4616a640ded8cd8644487fb24cb64f6b1eae8c7fe0e856"} Dec 01 10:47:01 crc kubenswrapper[4909]: E1201 10:47:01.586912 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 10:47:01 crc kubenswrapper[4909]: E1201 10:47:01.588090 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t558c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-g29zq_openstack(4507f0c4-3667-4e30-9e41-45e864fc724a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:47:01 crc kubenswrapper[4909]: E1201 10:47:01.589435 4909 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-g29zq" podUID="4507f0c4-3667-4e30-9e41-45e864fc724a" Dec 01 10:47:01 crc kubenswrapper[4909]: E1201 10:47:01.618402 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 10:47:01 crc kubenswrapper[4909]: E1201 10:47:01.618646 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6v292,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-xq8w6_openstack(9b3ea705-85ca-404c-b70e-41d9018d9f7d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:47:01 crc kubenswrapper[4909]: E1201 10:47:01.629156 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-xq8w6" podUID="9b3ea705-85ca-404c-b70e-41d9018d9f7d" Dec 01 10:47:01 crc kubenswrapper[4909]: E1201 10:47:01.636660 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 10:47:01 crc kubenswrapper[4909]: E1201 10:47:01.636890 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9mxt2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPoli
cy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-sl7td_openstack(e2ecb921-0356-4a1d-93dd-1fb91e41e081): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:47:01 crc kubenswrapper[4909]: E1201 10:47:01.638154 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-sl7td" podUID="e2ecb921-0356-4a1d-93dd-1fb91e41e081" Dec 01 10:47:02 crc kubenswrapper[4909]: I1201 10:47:02.194030 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 10:47:02 crc kubenswrapper[4909]: W1201 10:47:02.207644 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod226ba07f_6dee_4f12_9d0e_4ae327457c2e.slice/crio-6f67d1f2c7376924ecb88bdb43bcc5af5dd5f5e04bd11ac9f731f0ed2ef413c4 WatchSource:0}: Error finding container 6f67d1f2c7376924ecb88bdb43bcc5af5dd5f5e04bd11ac9f731f0ed2ef413c4: Status 404 returned error can't find the container with id 6f67d1f2c7376924ecb88bdb43bcc5af5dd5f5e04bd11ac9f731f0ed2ef413c4 Dec 01 10:47:02 crc kubenswrapper[4909]: I1201 10:47:02.312822 4909 generic.go:334] "Generic (PLEG): container finished" podID="7ef13416-f63b-4949-bf3e-83e9b95c58fc" 
containerID="330622f0c048cfbbd90af2b0ce802b91a4b46107ffcc723cf5fe12c0c14b2a04" exitCode=0 Dec 01 10:47:02 crc kubenswrapper[4909]: I1201 10:47:02.313348 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-d9qt5" event={"ID":"7ef13416-f63b-4949-bf3e-83e9b95c58fc","Type":"ContainerDied","Data":"330622f0c048cfbbd90af2b0ce802b91a4b46107ffcc723cf5fe12c0c14b2a04"} Dec 01 10:47:02 crc kubenswrapper[4909]: I1201 10:47:02.320811 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"226ba07f-6dee-4f12-9d0e-4ae327457c2e","Type":"ContainerStarted","Data":"6f67d1f2c7376924ecb88bdb43bcc5af5dd5f5e04bd11ac9f731f0ed2ef413c4"} Dec 01 10:47:02 crc kubenswrapper[4909]: I1201 10:47:02.900393 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 01 10:47:02 crc kubenswrapper[4909]: I1201 10:47:02.926207 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 10:47:02 crc kubenswrapper[4909]: I1201 10:47:02.945616 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 01 10:47:02 crc kubenswrapper[4909]: W1201 10:47:02.951281 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b522c19_139d_41b2_ad31_94e12157a398.slice/crio-3ba4ac3777adcf0882ece1b4f595714ffbb1d13888ace2d2d6b2e02acccc5b9c WatchSource:0}: Error finding container 3ba4ac3777adcf0882ece1b4f595714ffbb1d13888ace2d2d6b2e02acccc5b9c: Status 404 returned error can't find the container with id 3ba4ac3777adcf0882ece1b4f595714ffbb1d13888ace2d2d6b2e02acccc5b9c Dec 01 10:47:02 crc kubenswrapper[4909]: I1201 10:47:02.952014 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 10:47:02 crc kubenswrapper[4909]: W1201 10:47:02.958263 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd516e7e6_4b24_41b8_bde1_d533d1e77d61.slice/crio-83b77c007123340e9ff2bba6ab4dbbc79743e13f1afb3e0178c516d93a11f886 WatchSource:0}: Error finding container 83b77c007123340e9ff2bba6ab4dbbc79743e13f1afb3e0178c516d93a11f886: Status 404 returned error can't find the container with id 83b77c007123340e9ff2bba6ab4dbbc79743e13f1afb3e0178c516d93a11f886 Dec 01 10:47:02 crc kubenswrapper[4909]: W1201 10:47:02.970974 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9b91481_b2f9_4ad4_8720_6b9f7d27c2b9.slice/crio-f313904ab5c487529e594e62773120cc6752650d61cafdd3ff3796bc34a1d033 WatchSource:0}: Error finding container f313904ab5c487529e594e62773120cc6752650d61cafdd3ff3796bc34a1d033: Status 404 returned error can't find the container with id f313904ab5c487529e594e62773120cc6752650d61cafdd3ff3796bc34a1d033 Dec 01 10:47:02 crc kubenswrapper[4909]: W1201 10:47:02.982213 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b2b1919_713a_48ec_9bed_34d7d2c8bfc8.slice/crio-1c469f52da707e5f2f40d50769f73f9af0eadb41d65d9e559d641b989feae197 WatchSource:0}: Error finding container 1c469f52da707e5f2f40d50769f73f9af0eadb41d65d9e559d641b989feae197: Status 404 returned error can't find the container with id 1c469f52da707e5f2f40d50769f73f9af0eadb41d65d9e559d641b989feae197 Dec 01 10:47:02 crc kubenswrapper[4909]: I1201 10:47:02.999539 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-g29zq" Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.004592 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-xq8w6" Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.100066 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.153708 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v292\" (UniqueName: \"kubernetes.io/projected/9b3ea705-85ca-404c-b70e-41d9018d9f7d-kube-api-access-6v292\") pod \"9b3ea705-85ca-404c-b70e-41d9018d9f7d\" (UID: \"9b3ea705-85ca-404c-b70e-41d9018d9f7d\") " Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.154143 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t558c\" (UniqueName: \"kubernetes.io/projected/4507f0c4-3667-4e30-9e41-45e864fc724a-kube-api-access-t558c\") pod \"4507f0c4-3667-4e30-9e41-45e864fc724a\" (UID: \"4507f0c4-3667-4e30-9e41-45e864fc724a\") " Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.154222 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b3ea705-85ca-404c-b70e-41d9018d9f7d-config\") pod \"9b3ea705-85ca-404c-b70e-41d9018d9f7d\" (UID: \"9b3ea705-85ca-404c-b70e-41d9018d9f7d\") " Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.154283 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4507f0c4-3667-4e30-9e41-45e864fc724a-config\") pod \"4507f0c4-3667-4e30-9e41-45e864fc724a\" (UID: \"4507f0c4-3667-4e30-9e41-45e864fc724a\") " Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.154391 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4507f0c4-3667-4e30-9e41-45e864fc724a-dns-svc\") pod \"4507f0c4-3667-4e30-9e41-45e864fc724a\" (UID: \"4507f0c4-3667-4e30-9e41-45e864fc724a\") " Dec 01 10:47:03 crc 
kubenswrapper[4909]: I1201 10:47:03.155605 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4507f0c4-3667-4e30-9e41-45e864fc724a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4507f0c4-3667-4e30-9e41-45e864fc724a" (UID: "4507f0c4-3667-4e30-9e41-45e864fc724a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.156701 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b3ea705-85ca-404c-b70e-41d9018d9f7d-config" (OuterVolumeSpecName: "config") pod "9b3ea705-85ca-404c-b70e-41d9018d9f7d" (UID: "9b3ea705-85ca-404c-b70e-41d9018d9f7d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.164817 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4507f0c4-3667-4e30-9e41-45e864fc724a-kube-api-access-t558c" (OuterVolumeSpecName: "kube-api-access-t558c") pod "4507f0c4-3667-4e30-9e41-45e864fc724a" (UID: "4507f0c4-3667-4e30-9e41-45e864fc724a"). InnerVolumeSpecName "kube-api-access-t558c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.165374 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4507f0c4-3667-4e30-9e41-45e864fc724a-config" (OuterVolumeSpecName: "config") pod "4507f0c4-3667-4e30-9e41-45e864fc724a" (UID: "4507f0c4-3667-4e30-9e41-45e864fc724a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.168184 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b3ea705-85ca-404c-b70e-41d9018d9f7d-kube-api-access-6v292" (OuterVolumeSpecName: "kube-api-access-6v292") pod "9b3ea705-85ca-404c-b70e-41d9018d9f7d" (UID: "9b3ea705-85ca-404c-b70e-41d9018d9f7d"). InnerVolumeSpecName "kube-api-access-6v292". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.258125 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b3ea705-85ca-404c-b70e-41d9018d9f7d-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.258168 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4507f0c4-3667-4e30-9e41-45e864fc724a-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.258182 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4507f0c4-3667-4e30-9e41-45e864fc724a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.258197 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v292\" (UniqueName: \"kubernetes.io/projected/9b3ea705-85ca-404c-b70e-41d9018d9f7d-kube-api-access-6v292\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.258210 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t558c\" (UniqueName: \"kubernetes.io/projected/4507f0c4-3667-4e30-9e41-45e864fc724a-kube-api-access-t558c\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.382118 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zcgpf"] 
Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.394732 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-sl7td" event={"ID":"e2ecb921-0356-4a1d-93dd-1fb91e41e081","Type":"ContainerDied","Data":"9e2a460a6ca50d1865e83a2fa1f089e0c3eff121aebcbab8268da127cebb3608"} Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.394627 4909 generic.go:334] "Generic (PLEG): container finished" podID="e2ecb921-0356-4a1d-93dd-1fb91e41e081" containerID="9e2a460a6ca50d1865e83a2fa1f089e0c3eff121aebcbab8268da127cebb3608" exitCode=0 Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.411587 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-d9qt5" event={"ID":"7ef13416-f63b-4949-bf3e-83e9b95c58fc","Type":"ContainerStarted","Data":"c18c0577e75a71c3b025191ed22a14b55951234fe3a376eb602e15f4cd1f185a"} Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.432191 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dcc01640-a3cc-49fe-b49a-ef344e34d793","Type":"ContainerStarted","Data":"187e983e8171f9bd444e4aa7e65a46d6b19af7a1d817ed3bf137ce0cda8a36fe"} Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.438003 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9","Type":"ContainerStarted","Data":"f313904ab5c487529e594e62773120cc6752650d61cafdd3ff3796bc34a1d033"} Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.450731 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5b522c19-139d-41b2-ad31-94e12157a398","Type":"ContainerStarted","Data":"3ba4ac3777adcf0882ece1b4f595714ffbb1d13888ace2d2d6b2e02acccc5b9c"} Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.459113 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-g29zq" Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.459131 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-g29zq" event={"ID":"4507f0c4-3667-4e30-9e41-45e864fc724a","Type":"ContainerDied","Data":"097c7477fef6fa2a47cdcdd8d88533263c138bdc7fd18f16928053e8961ec88a"} Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.463702 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-xq8w6" event={"ID":"9b3ea705-85ca-404c-b70e-41d9018d9f7d","Type":"ContainerDied","Data":"204f6537dd6bab84a9b8ddeb850e580a673eaaabd6b1b2625dd0b06b5513147d"} Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.465728 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1b2b1919-713a-48ec-9bed-34d7d2c8bfc8","Type":"ContainerStarted","Data":"1c469f52da707e5f2f40d50769f73f9af0eadb41d65d9e559d641b989feae197"} Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.465915 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-xq8w6" Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.466818 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d516e7e6-4b24-41b8-bde1-d533d1e77d61","Type":"ContainerStarted","Data":"83b77c007123340e9ff2bba6ab4dbbc79743e13f1afb3e0178c516d93a11f886"} Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.534153 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 10:47:03 crc kubenswrapper[4909]: W1201 10:47:03.565782 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc44b2fd1_af1d_4e0a_8316_eedf732df3ce.slice/crio-c2f24c54e0ac3dcdfcc1ec912cb312e0be57a407a23a4662db2878a0db78d466 WatchSource:0}: Error finding container c2f24c54e0ac3dcdfcc1ec912cb312e0be57a407a23a4662db2878a0db78d466: Status 404 returned error can't find the container with id c2f24c54e0ac3dcdfcc1ec912cb312e0be57a407a23a4662db2878a0db78d466 Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.590319 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-xq8w6"] Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.597953 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-xq8w6"] Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.615064 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-g29zq"] Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.625151 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-g29zq"] Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.627366 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-d9qt5" podStartSLOduration=3.428850818 podStartE2EDuration="22.627347461s" podCreationTimestamp="2025-12-01 
10:46:41 +0000 UTC" firstStartedPulling="2025-12-01 10:46:42.562688812 +0000 UTC m=+919.797159710" lastFinishedPulling="2025-12-01 10:47:01.761185455 +0000 UTC m=+938.995656353" observedRunningTime="2025-12-01 10:47:03.604813069 +0000 UTC m=+940.839283977" watchObservedRunningTime="2025-12-01 10:47:03.627347461 +0000 UTC m=+940.861818359" Dec 01 10:47:03 crc kubenswrapper[4909]: I1201 10:47:03.852270 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-z64hl"] Dec 01 10:47:04 crc kubenswrapper[4909]: I1201 10:47:04.477444 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zcgpf" event={"ID":"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e","Type":"ContainerStarted","Data":"9ff867133689cfd9e8ae1af53a0e260af3c44ebbcb4242ec119099b85d1bcc6f"} Dec 01 10:47:04 crc kubenswrapper[4909]: I1201 10:47:04.479622 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z64hl" event={"ID":"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4","Type":"ContainerStarted","Data":"6f1b794c8caef87e18156da10220f8f92c30836c2c89c5e7d83869334342bf79"} Dec 01 10:47:04 crc kubenswrapper[4909]: I1201 10:47:04.481909 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-sl7td" event={"ID":"e2ecb921-0356-4a1d-93dd-1fb91e41e081","Type":"ContainerStarted","Data":"695a6f1b16e98de141bcf864291d8e68b645eccf3c8886650c68151e2d8d6c60"} Dec 01 10:47:04 crc kubenswrapper[4909]: I1201 10:47:04.483312 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-sl7td" Dec 01 10:47:04 crc kubenswrapper[4909]: I1201 10:47:04.496211 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c44b2fd1-af1d-4e0a-8316-eedf732df3ce","Type":"ContainerStarted","Data":"c2f24c54e0ac3dcdfcc1ec912cb312e0be57a407a23a4662db2878a0db78d466"} Dec 01 10:47:04 crc kubenswrapper[4909]: I1201 10:47:04.496262 4909 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-d9qt5" Dec 01 10:47:05 crc kubenswrapper[4909]: I1201 10:47:05.270550 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4507f0c4-3667-4e30-9e41-45e864fc724a" path="/var/lib/kubelet/pods/4507f0c4-3667-4e30-9e41-45e864fc724a/volumes" Dec 01 10:47:05 crc kubenswrapper[4909]: I1201 10:47:05.271452 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b3ea705-85ca-404c-b70e-41d9018d9f7d" path="/var/lib/kubelet/pods/9b3ea705-85ca-404c-b70e-41d9018d9f7d/volumes" Dec 01 10:47:06 crc kubenswrapper[4909]: I1201 10:47:06.193206 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:47:06 crc kubenswrapper[4909]: I1201 10:47:06.193288 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:47:06 crc kubenswrapper[4909]: I1201 10:47:06.193334 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:47:06 crc kubenswrapper[4909]: I1201 10:47:06.194124 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"421cf8f5c478fd334e97e45775a9bedfa323e6a4c50a049b81ebf8da31dc53c8"} pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:47:06 crc 
kubenswrapper[4909]: I1201 10:47:06.194191 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" containerID="cri-o://421cf8f5c478fd334e97e45775a9bedfa323e6a4c50a049b81ebf8da31dc53c8" gracePeriod=600 Dec 01 10:47:06 crc kubenswrapper[4909]: I1201 10:47:06.520690 4909 generic.go:334] "Generic (PLEG): container finished" podID="672850e4-d044-44cc-b8a2-517dc1a285be" containerID="421cf8f5c478fd334e97e45775a9bedfa323e6a4c50a049b81ebf8da31dc53c8" exitCode=0 Dec 01 10:47:06 crc kubenswrapper[4909]: I1201 10:47:06.520762 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerDied","Data":"421cf8f5c478fd334e97e45775a9bedfa323e6a4c50a049b81ebf8da31dc53c8"} Dec 01 10:47:06 crc kubenswrapper[4909]: I1201 10:47:06.521123 4909 scope.go:117] "RemoveContainer" containerID="1261d57afc6b7af0172cc6d97bb6e0cf382f59bc9c526de8c48bb45bac9b39b3" Dec 01 10:47:08 crc kubenswrapper[4909]: I1201 10:47:08.497848 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-sl7td" podStartSLOduration=-9223372009.356981 podStartE2EDuration="27.49779419s" podCreationTimestamp="2025-12-01 10:46:41 +0000 UTC" firstStartedPulling="2025-12-01 10:46:42.912412951 +0000 UTC m=+920.146883849" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:47:04.505749483 +0000 UTC m=+941.740220381" watchObservedRunningTime="2025-12-01 10:47:08.49779419 +0000 UTC m=+945.732265098" Dec 01 10:47:08 crc kubenswrapper[4909]: I1201 10:47:08.500707 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gr6sc"] Dec 01 10:47:08 crc kubenswrapper[4909]: I1201 10:47:08.504570 4909 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gr6sc" Dec 01 10:47:08 crc kubenswrapper[4909]: I1201 10:47:08.535450 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gr6sc"] Dec 01 10:47:08 crc kubenswrapper[4909]: I1201 10:47:08.682692 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4888\" (UniqueName: \"kubernetes.io/projected/103f4624-c750-4f9f-95c8-38dee6c94e77-kube-api-access-t4888\") pod \"certified-operators-gr6sc\" (UID: \"103f4624-c750-4f9f-95c8-38dee6c94e77\") " pod="openshift-marketplace/certified-operators-gr6sc" Dec 01 10:47:08 crc kubenswrapper[4909]: I1201 10:47:08.682782 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/103f4624-c750-4f9f-95c8-38dee6c94e77-utilities\") pod \"certified-operators-gr6sc\" (UID: \"103f4624-c750-4f9f-95c8-38dee6c94e77\") " pod="openshift-marketplace/certified-operators-gr6sc" Dec 01 10:47:08 crc kubenswrapper[4909]: I1201 10:47:08.682835 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/103f4624-c750-4f9f-95c8-38dee6c94e77-catalog-content\") pod \"certified-operators-gr6sc\" (UID: \"103f4624-c750-4f9f-95c8-38dee6c94e77\") " pod="openshift-marketplace/certified-operators-gr6sc" Dec 01 10:47:08 crc kubenswrapper[4909]: I1201 10:47:08.784962 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/103f4624-c750-4f9f-95c8-38dee6c94e77-utilities\") pod \"certified-operators-gr6sc\" (UID: \"103f4624-c750-4f9f-95c8-38dee6c94e77\") " pod="openshift-marketplace/certified-operators-gr6sc" Dec 01 10:47:08 crc kubenswrapper[4909]: I1201 10:47:08.785062 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/103f4624-c750-4f9f-95c8-38dee6c94e77-catalog-content\") pod \"certified-operators-gr6sc\" (UID: \"103f4624-c750-4f9f-95c8-38dee6c94e77\") " pod="openshift-marketplace/certified-operators-gr6sc" Dec 01 10:47:08 crc kubenswrapper[4909]: I1201 10:47:08.785172 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4888\" (UniqueName: \"kubernetes.io/projected/103f4624-c750-4f9f-95c8-38dee6c94e77-kube-api-access-t4888\") pod \"certified-operators-gr6sc\" (UID: \"103f4624-c750-4f9f-95c8-38dee6c94e77\") " pod="openshift-marketplace/certified-operators-gr6sc" Dec 01 10:47:08 crc kubenswrapper[4909]: I1201 10:47:08.786212 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/103f4624-c750-4f9f-95c8-38dee6c94e77-utilities\") pod \"certified-operators-gr6sc\" (UID: \"103f4624-c750-4f9f-95c8-38dee6c94e77\") " pod="openshift-marketplace/certified-operators-gr6sc" Dec 01 10:47:08 crc kubenswrapper[4909]: I1201 10:47:08.786525 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/103f4624-c750-4f9f-95c8-38dee6c94e77-catalog-content\") pod \"certified-operators-gr6sc\" (UID: \"103f4624-c750-4f9f-95c8-38dee6c94e77\") " pod="openshift-marketplace/certified-operators-gr6sc" Dec 01 10:47:08 crc kubenswrapper[4909]: I1201 10:47:08.842148 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4888\" (UniqueName: \"kubernetes.io/projected/103f4624-c750-4f9f-95c8-38dee6c94e77-kube-api-access-t4888\") pod \"certified-operators-gr6sc\" (UID: \"103f4624-c750-4f9f-95c8-38dee6c94e77\") " pod="openshift-marketplace/certified-operators-gr6sc" Dec 01 10:47:08 crc kubenswrapper[4909]: I1201 10:47:08.844610 4909 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gr6sc" Dec 01 10:47:11 crc kubenswrapper[4909]: I1201 10:47:11.949137 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-d9qt5" Dec 01 10:47:12 crc kubenswrapper[4909]: I1201 10:47:12.335089 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-sl7td" Dec 01 10:47:12 crc kubenswrapper[4909]: I1201 10:47:12.421561 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-d9qt5"] Dec 01 10:47:12 crc kubenswrapper[4909]: I1201 10:47:12.559961 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gr6sc"] Dec 01 10:47:12 crc kubenswrapper[4909]: I1201 10:47:12.584651 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"8100ded86185432844121910234322762069105ef0bb9776e57888f1149baba1"} Dec 01 10:47:12 crc kubenswrapper[4909]: I1201 10:47:12.585143 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-d9qt5" podUID="7ef13416-f63b-4949-bf3e-83e9b95c58fc" containerName="dnsmasq-dns" containerID="cri-o://c18c0577e75a71c3b025191ed22a14b55951234fe3a376eb602e15f4cd1f185a" gracePeriod=10 Dec 01 10:47:12 crc kubenswrapper[4909]: W1201 10:47:12.861274 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod103f4624_c750_4f9f_95c8_38dee6c94e77.slice/crio-f2782dd79c25069c2f684a02b8243e28e4c4de717f9f7ab90005cc67528d6a57 WatchSource:0}: Error finding container f2782dd79c25069c2f684a02b8243e28e4c4de717f9f7ab90005cc67528d6a57: Status 404 returned error can't find the container with id 
f2782dd79c25069c2f684a02b8243e28e4c4de717f9f7ab90005cc67528d6a57 Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.274428 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-d9qt5" Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.384616 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ef13416-f63b-4949-bf3e-83e9b95c58fc-config\") pod \"7ef13416-f63b-4949-bf3e-83e9b95c58fc\" (UID: \"7ef13416-f63b-4949-bf3e-83e9b95c58fc\") " Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.384699 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l44zd\" (UniqueName: \"kubernetes.io/projected/7ef13416-f63b-4949-bf3e-83e9b95c58fc-kube-api-access-l44zd\") pod \"7ef13416-f63b-4949-bf3e-83e9b95c58fc\" (UID: \"7ef13416-f63b-4949-bf3e-83e9b95c58fc\") " Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.384804 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ef13416-f63b-4949-bf3e-83e9b95c58fc-dns-svc\") pod \"7ef13416-f63b-4949-bf3e-83e9b95c58fc\" (UID: \"7ef13416-f63b-4949-bf3e-83e9b95c58fc\") " Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.402349 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef13416-f63b-4949-bf3e-83e9b95c58fc-kube-api-access-l44zd" (OuterVolumeSpecName: "kube-api-access-l44zd") pod "7ef13416-f63b-4949-bf3e-83e9b95c58fc" (UID: "7ef13416-f63b-4949-bf3e-83e9b95c58fc"). InnerVolumeSpecName "kube-api-access-l44zd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.487134 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l44zd\" (UniqueName: \"kubernetes.io/projected/7ef13416-f63b-4949-bf3e-83e9b95c58fc-kube-api-access-l44zd\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.597147 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zcgpf" event={"ID":"a2ab5fcc-f33d-4495-a7e4-c4305f3e846e","Type":"ContainerStarted","Data":"f35868b9c0102683a93e26aa8e8fd2ff4f2af524360a4cd11807b6e0538cfd78"} Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.598095 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-zcgpf" Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.599900 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9","Type":"ContainerStarted","Data":"cebfaab6c60faaac61101acf0adc0eb6760a6ce7e66710c514f54b4880f783cd"} Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.601651 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5b522c19-139d-41b2-ad31-94e12157a398","Type":"ContainerStarted","Data":"c4f6fd36e24068b3587b2ec2862cfeb625bb4ac67efb1e9247825f2a10e24eb6"} Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.602839 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c44b2fd1-af1d-4e0a-8316-eedf732df3ce","Type":"ContainerStarted","Data":"8fcef7ebf2e9532e52a9af718bd4b323efcab5e91f0d30156f2018bc131e5d2f"} Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.604259 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"d516e7e6-4b24-41b8-bde1-d533d1e77d61","Type":"ContainerStarted","Data":"36f70cb09fad2e03eb057059fdfb2bf7be2aca72ecc66076662bb1e55f6151c2"} Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.605010 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.606657 4909 generic.go:334] "Generic (PLEG): container finished" podID="103f4624-c750-4f9f-95c8-38dee6c94e77" containerID="9772df0d1dc544dad54951fba69757249e48c919461538c7ca32396b06f59ea3" exitCode=0 Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.606720 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gr6sc" event={"ID":"103f4624-c750-4f9f-95c8-38dee6c94e77","Type":"ContainerDied","Data":"9772df0d1dc544dad54951fba69757249e48c919461538c7ca32396b06f59ea3"} Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.606740 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gr6sc" event={"ID":"103f4624-c750-4f9f-95c8-38dee6c94e77","Type":"ContainerStarted","Data":"f2782dd79c25069c2f684a02b8243e28e4c4de717f9f7ab90005cc67528d6a57"} Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.610090 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z64hl" event={"ID":"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4","Type":"ContainerStarted","Data":"31c894f82778739abb188712e220e8c73507adf22d4d935df78fa09f9c0f5b59"} Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.612686 4909 generic.go:334] "Generic (PLEG): container finished" podID="7ef13416-f63b-4949-bf3e-83e9b95c58fc" containerID="c18c0577e75a71c3b025191ed22a14b55951234fe3a376eb602e15f4cd1f185a" exitCode=0 Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.612741 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-d9qt5" Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.612776 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-d9qt5" event={"ID":"7ef13416-f63b-4949-bf3e-83e9b95c58fc","Type":"ContainerDied","Data":"c18c0577e75a71c3b025191ed22a14b55951234fe3a376eb602e15f4cd1f185a"} Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.612821 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-d9qt5" event={"ID":"7ef13416-f63b-4949-bf3e-83e9b95c58fc","Type":"ContainerDied","Data":"a2e8b5c0d01293e501850130a9c07c23e394a871d12e19df1053bfbf30a1e7ca"} Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.612843 4909 scope.go:117] "RemoveContainer" containerID="c18c0577e75a71c3b025191ed22a14b55951234fe3a376eb602e15f4cd1f185a" Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.620051 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dcc01640-a3cc-49fe-b49a-ef344e34d793","Type":"ContainerStarted","Data":"310714d38ad6f65952ed154e8af187fcc68c2815c4e22778ae987b82da80c7a6"} Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.623565 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-zcgpf" podStartSLOduration=13.167303068 podStartE2EDuration="22.623542702s" podCreationTimestamp="2025-12-01 10:46:51 +0000 UTC" firstStartedPulling="2025-12-01 10:47:03.413179404 +0000 UTC m=+940.647650302" lastFinishedPulling="2025-12-01 10:47:12.869419028 +0000 UTC m=+950.103889936" observedRunningTime="2025-12-01 10:47:13.615679417 +0000 UTC m=+950.850150315" watchObservedRunningTime="2025-12-01 10:47:13.623542702 +0000 UTC m=+950.858013600" Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.624254 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"1b2b1919-713a-48ec-9bed-34d7d2c8bfc8","Type":"ContainerStarted","Data":"89168ab99e02db4e1145039f649bc658aaaac026b1d21ce7661f786a43745ae2"} Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.624310 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.677359 4909 scope.go:117] "RemoveContainer" containerID="330622f0c048cfbbd90af2b0ce802b91a4b46107ffcc723cf5fe12c0c14b2a04" Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.726548 4909 scope.go:117] "RemoveContainer" containerID="c18c0577e75a71c3b025191ed22a14b55951234fe3a376eb602e15f4cd1f185a" Dec 01 10:47:13 crc kubenswrapper[4909]: E1201 10:47:13.727817 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c18c0577e75a71c3b025191ed22a14b55951234fe3a376eb602e15f4cd1f185a\": container with ID starting with c18c0577e75a71c3b025191ed22a14b55951234fe3a376eb602e15f4cd1f185a not found: ID does not exist" containerID="c18c0577e75a71c3b025191ed22a14b55951234fe3a376eb602e15f4cd1f185a" Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.727896 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c18c0577e75a71c3b025191ed22a14b55951234fe3a376eb602e15f4cd1f185a"} err="failed to get container status \"c18c0577e75a71c3b025191ed22a14b55951234fe3a376eb602e15f4cd1f185a\": rpc error: code = NotFound desc = could not find container \"c18c0577e75a71c3b025191ed22a14b55951234fe3a376eb602e15f4cd1f185a\": container with ID starting with c18c0577e75a71c3b025191ed22a14b55951234fe3a376eb602e15f4cd1f185a not found: ID does not exist" Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.727929 4909 scope.go:117] "RemoveContainer" containerID="330622f0c048cfbbd90af2b0ce802b91a4b46107ffcc723cf5fe12c0c14b2a04" Dec 01 10:47:13 crc kubenswrapper[4909]: E1201 10:47:13.728162 4909 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"330622f0c048cfbbd90af2b0ce802b91a4b46107ffcc723cf5fe12c0c14b2a04\": container with ID starting with 330622f0c048cfbbd90af2b0ce802b91a4b46107ffcc723cf5fe12c0c14b2a04 not found: ID does not exist" containerID="330622f0c048cfbbd90af2b0ce802b91a4b46107ffcc723cf5fe12c0c14b2a04" Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.728189 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"330622f0c048cfbbd90af2b0ce802b91a4b46107ffcc723cf5fe12c0c14b2a04"} err="failed to get container status \"330622f0c048cfbbd90af2b0ce802b91a4b46107ffcc723cf5fe12c0c14b2a04\": rpc error: code = NotFound desc = could not find container \"330622f0c048cfbbd90af2b0ce802b91a4b46107ffcc723cf5fe12c0c14b2a04\": container with ID starting with 330622f0c048cfbbd90af2b0ce802b91a4b46107ffcc723cf5fe12c0c14b2a04 not found: ID does not exist" Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.739860 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=17.868637249 podStartE2EDuration="27.73984147s" podCreationTimestamp="2025-12-01 10:46:46 +0000 UTC" firstStartedPulling="2025-12-01 10:47:02.992652956 +0000 UTC m=+940.227123844" lastFinishedPulling="2025-12-01 10:47:12.863857167 +0000 UTC m=+950.098328065" observedRunningTime="2025-12-01 10:47:13.739684155 +0000 UTC m=+950.974155063" watchObservedRunningTime="2025-12-01 10:47:13.73984147 +0000 UTC m=+950.974312368" Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.800107 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=15.684791657 podStartE2EDuration="25.800084317s" podCreationTimestamp="2025-12-01 10:46:48 +0000 UTC" firstStartedPulling="2025-12-01 10:47:02.98602383 +0000 UTC m=+940.220494728" lastFinishedPulling="2025-12-01 
10:47:13.10131649 +0000 UTC m=+950.335787388" observedRunningTime="2025-12-01 10:47:13.787221309 +0000 UTC m=+951.021692227" watchObservedRunningTime="2025-12-01 10:47:13.800084317 +0000 UTC m=+951.034555215" Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.833574 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ef13416-f63b-4949-bf3e-83e9b95c58fc-config" (OuterVolumeSpecName: "config") pod "7ef13416-f63b-4949-bf3e-83e9b95c58fc" (UID: "7ef13416-f63b-4949-bf3e-83e9b95c58fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:47:13 crc kubenswrapper[4909]: I1201 10:47:13.901179 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ef13416-f63b-4949-bf3e-83e9b95c58fc-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:14 crc kubenswrapper[4909]: I1201 10:47:14.326927 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ef13416-f63b-4949-bf3e-83e9b95c58fc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7ef13416-f63b-4949-bf3e-83e9b95c58fc" (UID: "7ef13416-f63b-4949-bf3e-83e9b95c58fc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:47:14 crc kubenswrapper[4909]: I1201 10:47:14.410799 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ef13416-f63b-4949-bf3e-83e9b95c58fc-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:14 crc kubenswrapper[4909]: I1201 10:47:14.550117 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-d9qt5"] Dec 01 10:47:14 crc kubenswrapper[4909]: I1201 10:47:14.558842 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-d9qt5"] Dec 01 10:47:14 crc kubenswrapper[4909]: I1201 10:47:14.636421 4909 generic.go:334] "Generic (PLEG): container finished" podID="cfbdb448-fb4d-48cd-8fa4-b9309172c7f4" containerID="31c894f82778739abb188712e220e8c73507adf22d4d935df78fa09f9c0f5b59" exitCode=0 Dec 01 10:47:14 crc kubenswrapper[4909]: I1201 10:47:14.636492 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z64hl" event={"ID":"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4","Type":"ContainerDied","Data":"31c894f82778739abb188712e220e8c73507adf22d4d935df78fa09f9c0f5b59"} Dec 01 10:47:14 crc kubenswrapper[4909]: I1201 10:47:14.644666 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"226ba07f-6dee-4f12-9d0e-4ae327457c2e","Type":"ContainerStarted","Data":"decbb764066fab68413640f5ed91146e03e1ed0ca0234962058e3d16b081d56a"} Dec 01 10:47:14 crc kubenswrapper[4909]: I1201 10:47:14.649459 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fa1d0c2b-1efc-451b-9fe5-58debd89810e","Type":"ContainerStarted","Data":"ff97b98aa46f1e357eecf9003d1f326ad5676fd78ea78233559f9b378f7c8f59"} Dec 01 10:47:14 crc kubenswrapper[4909]: I1201 10:47:14.673517 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gr6sc" 
event={"ID":"103f4624-c750-4f9f-95c8-38dee6c94e77","Type":"ContainerStarted","Data":"fe17580222926fca6d9e55697c257add42a18217925e8822edeaf724ff0d9ee0"} Dec 01 10:47:15 crc kubenswrapper[4909]: I1201 10:47:15.272136 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ef13416-f63b-4949-bf3e-83e9b95c58fc" path="/var/lib/kubelet/pods/7ef13416-f63b-4949-bf3e-83e9b95c58fc/volumes" Dec 01 10:47:15 crc kubenswrapper[4909]: I1201 10:47:15.682069 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z64hl" event={"ID":"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4","Type":"ContainerStarted","Data":"4c0bf00b999e5b90407e0e1d6132ee2dde7550eab89600b964457aec2c0a35d7"} Dec 01 10:47:15 crc kubenswrapper[4909]: I1201 10:47:15.682121 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z64hl" event={"ID":"cfbdb448-fb4d-48cd-8fa4-b9309172c7f4","Type":"ContainerStarted","Data":"d50448107fff28a24753ebf6c2045ee3078e72fcf089cfcbf66fdd9c963296b5"} Dec 01 10:47:15 crc kubenswrapper[4909]: I1201 10:47:15.682192 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:47:15 crc kubenswrapper[4909]: I1201 10:47:15.684836 4909 generic.go:334] "Generic (PLEG): container finished" podID="103f4624-c750-4f9f-95c8-38dee6c94e77" containerID="fe17580222926fca6d9e55697c257add42a18217925e8822edeaf724ff0d9ee0" exitCode=0 Dec 01 10:47:15 crc kubenswrapper[4909]: I1201 10:47:15.684902 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gr6sc" event={"ID":"103f4624-c750-4f9f-95c8-38dee6c94e77","Type":"ContainerDied","Data":"fe17580222926fca6d9e55697c257add42a18217925e8822edeaf724ff0d9ee0"} Dec 01 10:47:15 crc kubenswrapper[4909]: I1201 10:47:15.705917 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-z64hl" podStartSLOduration=17.049740387 
podStartE2EDuration="24.70589738s" podCreationTimestamp="2025-12-01 10:46:51 +0000 UTC" firstStartedPulling="2025-12-01 10:47:04.230128551 +0000 UTC m=+941.464599449" lastFinishedPulling="2025-12-01 10:47:11.886285544 +0000 UTC m=+949.120756442" observedRunningTime="2025-12-01 10:47:15.702914724 +0000 UTC m=+952.937385652" watchObservedRunningTime="2025-12-01 10:47:15.70589738 +0000 UTC m=+952.940368298" Dec 01 10:47:16 crc kubenswrapper[4909]: I1201 10:47:16.674222 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:47:17 crc kubenswrapper[4909]: I1201 10:47:17.717751 4909 generic.go:334] "Generic (PLEG): container finished" podID="b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9" containerID="cebfaab6c60faaac61101acf0adc0eb6760a6ce7e66710c514f54b4880f783cd" exitCode=0 Dec 01 10:47:17 crc kubenswrapper[4909]: I1201 10:47:17.717837 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9","Type":"ContainerDied","Data":"cebfaab6c60faaac61101acf0adc0eb6760a6ce7e66710c514f54b4880f783cd"} Dec 01 10:47:17 crc kubenswrapper[4909]: I1201 10:47:17.722122 4909 generic.go:334] "Generic (PLEG): container finished" podID="5b522c19-139d-41b2-ad31-94e12157a398" containerID="c4f6fd36e24068b3587b2ec2862cfeb625bb4ac67efb1e9247825f2a10e24eb6" exitCode=0 Dec 01 10:47:17 crc kubenswrapper[4909]: I1201 10:47:17.722173 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5b522c19-139d-41b2-ad31-94e12157a398","Type":"ContainerDied","Data":"c4f6fd36e24068b3587b2ec2862cfeb625bb4ac67efb1e9247825f2a10e24eb6"} Dec 01 10:47:18 crc kubenswrapper[4909]: I1201 10:47:18.393207 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 01 10:47:18 crc kubenswrapper[4909]: I1201 10:47:18.734397 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9","Type":"ContainerStarted","Data":"99aea88aeb1924d85ed27da0149c80859f89bc3ffdbd991b086e7a69fb679932"} Dec 01 10:47:18 crc kubenswrapper[4909]: I1201 10:47:18.737222 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dcc01640-a3cc-49fe-b49a-ef344e34d793","Type":"ContainerStarted","Data":"2f53c2aefb460faced9e3084645325fa52e96ab77f2645d3f1783976e80dfdee"} Dec 01 10:47:18 crc kubenswrapper[4909]: I1201 10:47:18.740190 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5b522c19-139d-41b2-ad31-94e12157a398","Type":"ContainerStarted","Data":"27e1767f6a2a802caed4fb1507d22dc44fab58bdc7fb58fa4ca9c848f81bdd63"} Dec 01 10:47:18 crc kubenswrapper[4909]: I1201 10:47:18.743005 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c44b2fd1-af1d-4e0a-8316-eedf732df3ce","Type":"ContainerStarted","Data":"589af6a9a07346ebed4f1c1b36506aab48aa1fc87ebeb1036dee2b199a4286cc"} Dec 01 10:47:18 crc kubenswrapper[4909]: I1201 10:47:18.745051 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gr6sc" event={"ID":"103f4624-c750-4f9f-95c8-38dee6c94e77","Type":"ContainerStarted","Data":"c31e488d4009936f556a0395049b1b045d5a9585ada8453863543f33846f4fbd"} Dec 01 10:47:18 crc kubenswrapper[4909]: I1201 10:47:18.764459 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.89647034 podStartE2EDuration="34.764440476s" podCreationTimestamp="2025-12-01 10:46:44 +0000 UTC" firstStartedPulling="2025-12-01 10:47:02.975193029 +0000 UTC m=+940.209663927" lastFinishedPulling="2025-12-01 10:47:12.843163165 +0000 UTC m=+950.077634063" observedRunningTime="2025-12-01 10:47:18.758906127 +0000 UTC m=+955.993377045" watchObservedRunningTime="2025-12-01 
10:47:18.764440476 +0000 UTC m=+955.998911374" Dec 01 10:47:18 crc kubenswrapper[4909]: I1201 10:47:18.791526 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.096134758 podStartE2EDuration="27.791506095s" podCreationTimestamp="2025-12-01 10:46:51 +0000 UTC" firstStartedPulling="2025-12-01 10:47:03.143601719 +0000 UTC m=+940.378072617" lastFinishedPulling="2025-12-01 10:47:17.838973066 +0000 UTC m=+955.073443954" observedRunningTime="2025-12-01 10:47:18.78515334 +0000 UTC m=+956.019624258" watchObservedRunningTime="2025-12-01 10:47:18.791506095 +0000 UTC m=+956.025977003" Dec 01 10:47:18 crc kubenswrapper[4909]: I1201 10:47:18.819620 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.392743052 podStartE2EDuration="24.819595167s" podCreationTimestamp="2025-12-01 10:46:54 +0000 UTC" firstStartedPulling="2025-12-01 10:47:03.567428215 +0000 UTC m=+940.801899113" lastFinishedPulling="2025-12-01 10:47:17.99428033 +0000 UTC m=+955.228751228" observedRunningTime="2025-12-01 10:47:18.81197654 +0000 UTC m=+956.046447468" watchObservedRunningTime="2025-12-01 10:47:18.819595167 +0000 UTC m=+956.054066055" Dec 01 10:47:18 crc kubenswrapper[4909]: I1201 10:47:18.839061 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gr6sc" podStartSLOduration=6.50028088 podStartE2EDuration="10.839034979s" podCreationTimestamp="2025-12-01 10:47:08 +0000 UTC" firstStartedPulling="2025-12-01 10:47:13.607864933 +0000 UTC m=+950.842335831" lastFinishedPulling="2025-12-01 10:47:17.946619012 +0000 UTC m=+955.181089930" observedRunningTime="2025-12-01 10:47:18.832183537 +0000 UTC m=+956.066654445" watchObservedRunningTime="2025-12-01 10:47:18.839034979 +0000 UTC m=+956.073505887" Dec 01 10:47:18 crc kubenswrapper[4909]: I1201 10:47:18.845664 4909 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gr6sc" Dec 01 10:47:18 crc kubenswrapper[4909]: I1201 10:47:18.845725 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gr6sc" Dec 01 10:47:18 crc kubenswrapper[4909]: I1201 10:47:18.855816 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=27.266951934 podStartE2EDuration="35.855796023s" podCreationTimestamp="2025-12-01 10:46:43 +0000 UTC" firstStartedPulling="2025-12-01 10:47:02.965835184 +0000 UTC m=+940.200306082" lastFinishedPulling="2025-12-01 10:47:11.554679273 +0000 UTC m=+948.789150171" observedRunningTime="2025-12-01 10:47:18.855506454 +0000 UTC m=+956.089977362" watchObservedRunningTime="2025-12-01 10:47:18.855796023 +0000 UTC m=+956.090266921" Dec 01 10:47:19 crc kubenswrapper[4909]: I1201 10:47:19.806725 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 01 10:47:19 crc kubenswrapper[4909]: I1201 10:47:19.848007 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 01 10:47:19 crc kubenswrapper[4909]: I1201 10:47:19.891178 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-gr6sc" podUID="103f4624-c750-4f9f-95c8-38dee6c94e77" containerName="registry-server" probeResult="failure" output=< Dec 01 10:47:19 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Dec 01 10:47:19 crc kubenswrapper[4909]: > Dec 01 10:47:20 crc kubenswrapper[4909]: I1201 10:47:20.362427 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 01 10:47:20 crc kubenswrapper[4909]: I1201 10:47:20.399796 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/ovsdbserver-nb-0" Dec 01 10:47:20 crc kubenswrapper[4909]: I1201 10:47:20.758864 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 01 10:47:20 crc kubenswrapper[4909]: I1201 10:47:20.758915 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 01 10:47:20 crc kubenswrapper[4909]: I1201 10:47:20.794927 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 01 10:47:20 crc kubenswrapper[4909]: I1201 10:47:20.795449 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.065408 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-qzl29"] Dec 01 10:47:21 crc kubenswrapper[4909]: E1201 10:47:21.065850 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef13416-f63b-4949-bf3e-83e9b95c58fc" containerName="init" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.065865 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef13416-f63b-4949-bf3e-83e9b95c58fc" containerName="init" Dec 01 10:47:21 crc kubenswrapper[4909]: E1201 10:47:21.065899 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef13416-f63b-4949-bf3e-83e9b95c58fc" containerName="dnsmasq-dns" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.065907 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef13416-f63b-4949-bf3e-83e9b95c58fc" containerName="dnsmasq-dns" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.066083 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef13416-f63b-4949-bf3e-83e9b95c58fc" containerName="dnsmasq-dns" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.067373 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-qzl29" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.071637 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.101932 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-qzl29"] Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.139833 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c950b7d2-7fcc-482a-a890-cf70d1022cc6-config\") pod \"dnsmasq-dns-7f896c8c65-qzl29\" (UID: \"c950b7d2-7fcc-482a-a890-cf70d1022cc6\") " pod="openstack/dnsmasq-dns-7f896c8c65-qzl29" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.140033 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c950b7d2-7fcc-482a-a890-cf70d1022cc6-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-qzl29\" (UID: \"c950b7d2-7fcc-482a-a890-cf70d1022cc6\") " pod="openstack/dnsmasq-dns-7f896c8c65-qzl29" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.140169 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6nj7\" (UniqueName: \"kubernetes.io/projected/c950b7d2-7fcc-482a-a890-cf70d1022cc6-kube-api-access-q6nj7\") pod \"dnsmasq-dns-7f896c8c65-qzl29\" (UID: \"c950b7d2-7fcc-482a-a890-cf70d1022cc6\") " pod="openstack/dnsmasq-dns-7f896c8c65-qzl29" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.140628 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c950b7d2-7fcc-482a-a890-cf70d1022cc6-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-qzl29\" (UID: \"c950b7d2-7fcc-482a-a890-cf70d1022cc6\") " 
pod="openstack/dnsmasq-dns-7f896c8c65-qzl29" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.242344 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c950b7d2-7fcc-482a-a890-cf70d1022cc6-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-qzl29\" (UID: \"c950b7d2-7fcc-482a-a890-cf70d1022cc6\") " pod="openstack/dnsmasq-dns-7f896c8c65-qzl29" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.242398 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c950b7d2-7fcc-482a-a890-cf70d1022cc6-config\") pod \"dnsmasq-dns-7f896c8c65-qzl29\" (UID: \"c950b7d2-7fcc-482a-a890-cf70d1022cc6\") " pod="openstack/dnsmasq-dns-7f896c8c65-qzl29" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.242443 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c950b7d2-7fcc-482a-a890-cf70d1022cc6-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-qzl29\" (UID: \"c950b7d2-7fcc-482a-a890-cf70d1022cc6\") " pod="openstack/dnsmasq-dns-7f896c8c65-qzl29" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.242474 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6nj7\" (UniqueName: \"kubernetes.io/projected/c950b7d2-7fcc-482a-a890-cf70d1022cc6-kube-api-access-q6nj7\") pod \"dnsmasq-dns-7f896c8c65-qzl29\" (UID: \"c950b7d2-7fcc-482a-a890-cf70d1022cc6\") " pod="openstack/dnsmasq-dns-7f896c8c65-qzl29" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.243945 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c950b7d2-7fcc-482a-a890-cf70d1022cc6-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-qzl29\" (UID: \"c950b7d2-7fcc-482a-a890-cf70d1022cc6\") " pod="openstack/dnsmasq-dns-7f896c8c65-qzl29" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 
10:47:21.244625 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c950b7d2-7fcc-482a-a890-cf70d1022cc6-config\") pod \"dnsmasq-dns-7f896c8c65-qzl29\" (UID: \"c950b7d2-7fcc-482a-a890-cf70d1022cc6\") " pod="openstack/dnsmasq-dns-7f896c8c65-qzl29" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.245320 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c950b7d2-7fcc-482a-a890-cf70d1022cc6-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-qzl29\" (UID: \"c950b7d2-7fcc-482a-a890-cf70d1022cc6\") " pod="openstack/dnsmasq-dns-7f896c8c65-qzl29" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.279069 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.281171 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.287466 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-n9mhh" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.287780 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.287976 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.288204 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.309254 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6nj7\" (UniqueName: \"kubernetes.io/projected/c950b7d2-7fcc-482a-a890-cf70d1022cc6-kube-api-access-q6nj7\") pod \"dnsmasq-dns-7f896c8c65-qzl29\" (UID: 
\"c950b7d2-7fcc-482a-a890-cf70d1022cc6\") " pod="openstack/dnsmasq-dns-7f896c8c65-qzl29" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.321842 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.344096 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1856063-0b40-4fee-ab30-024128a88da8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b1856063-0b40-4fee-ab30-024128a88da8\") " pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.347039 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1856063-0b40-4fee-ab30-024128a88da8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b1856063-0b40-4fee-ab30-024128a88da8\") " pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.347166 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b1856063-0b40-4fee-ab30-024128a88da8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b1856063-0b40-4fee-ab30-024128a88da8\") " pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.347369 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv8f7\" (UniqueName: \"kubernetes.io/projected/b1856063-0b40-4fee-ab30-024128a88da8-kube-api-access-gv8f7\") pod \"ovn-northd-0\" (UID: \"b1856063-0b40-4fee-ab30-024128a88da8\") " pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.347392 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b1856063-0b40-4fee-ab30-024128a88da8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b1856063-0b40-4fee-ab30-024128a88da8\") " pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.347462 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1856063-0b40-4fee-ab30-024128a88da8-scripts\") pod \"ovn-northd-0\" (UID: \"b1856063-0b40-4fee-ab30-024128a88da8\") " pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.347518 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1856063-0b40-4fee-ab30-024128a88da8-config\") pod \"ovn-northd-0\" (UID: \"b1856063-0b40-4fee-ab30-024128a88da8\") " pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.363670 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-qzl29"] Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.364553 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-qzl29" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.401922 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-mq7gt"] Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.409259 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-mq7gt" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.435783 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.462065 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dbfc51e-589a-43e5-805d-e5856f361b43-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mq7gt\" (UID: \"5dbfc51e-589a-43e5-805d-e5856f361b43\") " pod="openstack/ovn-controller-metrics-mq7gt" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.462123 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5dbfc51e-589a-43e5-805d-e5856f361b43-ovs-rundir\") pod \"ovn-controller-metrics-mq7gt\" (UID: \"5dbfc51e-589a-43e5-805d-e5856f361b43\") " pod="openstack/ovn-controller-metrics-mq7gt" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.462149 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv8f7\" (UniqueName: \"kubernetes.io/projected/b1856063-0b40-4fee-ab30-024128a88da8-kube-api-access-gv8f7\") pod \"ovn-northd-0\" (UID: \"b1856063-0b40-4fee-ab30-024128a88da8\") " pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.462165 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1856063-0b40-4fee-ab30-024128a88da8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b1856063-0b40-4fee-ab30-024128a88da8\") " pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.462195 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/5dbfc51e-589a-43e5-805d-e5856f361b43-ovn-rundir\") pod \"ovn-controller-metrics-mq7gt\" (UID: \"5dbfc51e-589a-43e5-805d-e5856f361b43\") " pod="openstack/ovn-controller-metrics-mq7gt" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.462211 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1856063-0b40-4fee-ab30-024128a88da8-scripts\") pod \"ovn-northd-0\" (UID: \"b1856063-0b40-4fee-ab30-024128a88da8\") " pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.462231 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1856063-0b40-4fee-ab30-024128a88da8-config\") pod \"ovn-northd-0\" (UID: \"b1856063-0b40-4fee-ab30-024128a88da8\") " pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.462250 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1856063-0b40-4fee-ab30-024128a88da8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b1856063-0b40-4fee-ab30-024128a88da8\") " pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.462285 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd56v\" (UniqueName: \"kubernetes.io/projected/5dbfc51e-589a-43e5-805d-e5856f361b43-kube-api-access-rd56v\") pod \"ovn-controller-metrics-mq7gt\" (UID: \"5dbfc51e-589a-43e5-805d-e5856f361b43\") " pod="openstack/ovn-controller-metrics-mq7gt" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.462332 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1856063-0b40-4fee-ab30-024128a88da8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: 
\"b1856063-0b40-4fee-ab30-024128a88da8\") " pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.462353 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dbfc51e-589a-43e5-805d-e5856f361b43-config\") pod \"ovn-controller-metrics-mq7gt\" (UID: \"5dbfc51e-589a-43e5-805d-e5856f361b43\") " pod="openstack/ovn-controller-metrics-mq7gt" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.462381 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dbfc51e-589a-43e5-805d-e5856f361b43-combined-ca-bundle\") pod \"ovn-controller-metrics-mq7gt\" (UID: \"5dbfc51e-589a-43e5-805d-e5856f361b43\") " pod="openstack/ovn-controller-metrics-mq7gt" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.462399 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b1856063-0b40-4fee-ab30-024128a88da8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b1856063-0b40-4fee-ab30-024128a88da8\") " pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.463096 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b1856063-0b40-4fee-ab30-024128a88da8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b1856063-0b40-4fee-ab30-024128a88da8\") " pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.472525 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-mq7gt"] Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.482509 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1856063-0b40-4fee-ab30-024128a88da8-scripts\") pod \"ovn-northd-0\" (UID: 
\"b1856063-0b40-4fee-ab30-024128a88da8\") " pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.483090 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1856063-0b40-4fee-ab30-024128a88da8-config\") pod \"ovn-northd-0\" (UID: \"b1856063-0b40-4fee-ab30-024128a88da8\") " pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.493404 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1856063-0b40-4fee-ab30-024128a88da8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b1856063-0b40-4fee-ab30-024128a88da8\") " pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.495019 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1856063-0b40-4fee-ab30-024128a88da8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b1856063-0b40-4fee-ab30-024128a88da8\") " pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.509219 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7vlz2"] Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.511654 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.513942 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv8f7\" (UniqueName: \"kubernetes.io/projected/b1856063-0b40-4fee-ab30-024128a88da8-kube-api-access-gv8f7\") pod \"ovn-northd-0\" (UID: \"b1856063-0b40-4fee-ab30-024128a88da8\") " pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.516604 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1856063-0b40-4fee-ab30-024128a88da8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b1856063-0b40-4fee-ab30-024128a88da8\") " pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.522857 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.525392 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.557538 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7vlz2"] Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.565485 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-7vlz2\" (UID: \"ba361527-47be-4c33-a659-f90cdabb757c\") " pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.565527 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-ovsdbserver-nb\") pod 
\"dnsmasq-dns-86db49b7ff-7vlz2\" (UID: \"ba361527-47be-4c33-a659-f90cdabb757c\") " pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.565556 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dbfc51e-589a-43e5-805d-e5856f361b43-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mq7gt\" (UID: \"5dbfc51e-589a-43e5-805d-e5856f361b43\") " pod="openstack/ovn-controller-metrics-mq7gt" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.565585 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5dbfc51e-589a-43e5-805d-e5856f361b43-ovs-rundir\") pod \"ovn-controller-metrics-mq7gt\" (UID: \"5dbfc51e-589a-43e5-805d-e5856f361b43\") " pod="openstack/ovn-controller-metrics-mq7gt" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.565613 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-config\") pod \"dnsmasq-dns-86db49b7ff-7vlz2\" (UID: \"ba361527-47be-4c33-a659-f90cdabb757c\") " pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.565633 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5dbfc51e-589a-43e5-805d-e5856f361b43-ovn-rundir\") pod \"ovn-controller-metrics-mq7gt\" (UID: \"5dbfc51e-589a-43e5-805d-e5856f361b43\") " pod="openstack/ovn-controller-metrics-mq7gt" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.565658 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd56v\" (UniqueName: \"kubernetes.io/projected/5dbfc51e-589a-43e5-805d-e5856f361b43-kube-api-access-rd56v\") pod \"ovn-controller-metrics-mq7gt\" (UID: 
\"5dbfc51e-589a-43e5-805d-e5856f361b43\") " pod="openstack/ovn-controller-metrics-mq7gt" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.565715 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-7vlz2\" (UID: \"ba361527-47be-4c33-a659-f90cdabb757c\") " pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.565734 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dbfc51e-589a-43e5-805d-e5856f361b43-config\") pod \"ovn-controller-metrics-mq7gt\" (UID: \"5dbfc51e-589a-43e5-805d-e5856f361b43\") " pod="openstack/ovn-controller-metrics-mq7gt" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.565755 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd2n2\" (UniqueName: \"kubernetes.io/projected/ba361527-47be-4c33-a659-f90cdabb757c-kube-api-access-sd2n2\") pod \"dnsmasq-dns-86db49b7ff-7vlz2\" (UID: \"ba361527-47be-4c33-a659-f90cdabb757c\") " pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.565786 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dbfc51e-589a-43e5-805d-e5856f361b43-combined-ca-bundle\") pod \"ovn-controller-metrics-mq7gt\" (UID: \"5dbfc51e-589a-43e5-805d-e5856f361b43\") " pod="openstack/ovn-controller-metrics-mq7gt" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.566501 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5dbfc51e-589a-43e5-805d-e5856f361b43-ovn-rundir\") pod \"ovn-controller-metrics-mq7gt\" (UID: \"5dbfc51e-589a-43e5-805d-e5856f361b43\") " 
pod="openstack/ovn-controller-metrics-mq7gt" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.568587 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5dbfc51e-589a-43e5-805d-e5856f361b43-ovs-rundir\") pod \"ovn-controller-metrics-mq7gt\" (UID: \"5dbfc51e-589a-43e5-805d-e5856f361b43\") " pod="openstack/ovn-controller-metrics-mq7gt" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.568594 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dbfc51e-589a-43e5-805d-e5856f361b43-config\") pod \"ovn-controller-metrics-mq7gt\" (UID: \"5dbfc51e-589a-43e5-805d-e5856f361b43\") " pod="openstack/ovn-controller-metrics-mq7gt" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.571107 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dbfc51e-589a-43e5-805d-e5856f361b43-combined-ca-bundle\") pod \"ovn-controller-metrics-mq7gt\" (UID: \"5dbfc51e-589a-43e5-805d-e5856f361b43\") " pod="openstack/ovn-controller-metrics-mq7gt" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.614375 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dbfc51e-589a-43e5-805d-e5856f361b43-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mq7gt\" (UID: \"5dbfc51e-589a-43e5-805d-e5856f361b43\") " pod="openstack/ovn-controller-metrics-mq7gt" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.617553 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd56v\" (UniqueName: \"kubernetes.io/projected/5dbfc51e-589a-43e5-805d-e5856f361b43-kube-api-access-rd56v\") pod \"ovn-controller-metrics-mq7gt\" (UID: \"5dbfc51e-589a-43e5-805d-e5856f361b43\") " pod="openstack/ovn-controller-metrics-mq7gt" Dec 01 10:47:21 crc 
kubenswrapper[4909]: I1201 10:47:21.664512 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.669137 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-7vlz2\" (UID: \"ba361527-47be-4c33-a659-f90cdabb757c\") " pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.669229 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd2n2\" (UniqueName: \"kubernetes.io/projected/ba361527-47be-4c33-a659-f90cdabb757c-kube-api-access-sd2n2\") pod \"dnsmasq-dns-86db49b7ff-7vlz2\" (UID: \"ba361527-47be-4c33-a659-f90cdabb757c\") " pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.669321 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-7vlz2\" (UID: \"ba361527-47be-4c33-a659-f90cdabb757c\") " pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.669347 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-7vlz2\" (UID: \"ba361527-47be-4c33-a659-f90cdabb757c\") " pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.669420 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-config\") pod \"dnsmasq-dns-86db49b7ff-7vlz2\" (UID: 
\"ba361527-47be-4c33-a659-f90cdabb757c\") " pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.677851 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-7vlz2\" (UID: \"ba361527-47be-4c33-a659-f90cdabb757c\") " pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.679988 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-config\") pod \"dnsmasq-dns-86db49b7ff-7vlz2\" (UID: \"ba361527-47be-4c33-a659-f90cdabb757c\") " pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.680094 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-7vlz2\" (UID: \"ba361527-47be-4c33-a659-f90cdabb757c\") " pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.680130 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-7vlz2\" (UID: \"ba361527-47be-4c33-a659-f90cdabb757c\") " pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.698035 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd2n2\" (UniqueName: \"kubernetes.io/projected/ba361527-47be-4c33-a659-f90cdabb757c-kube-api-access-sd2n2\") pod \"dnsmasq-dns-86db49b7ff-7vlz2\" (UID: \"ba361527-47be-4c33-a659-f90cdabb757c\") " pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" Dec 01 10:47:21 crc 
kubenswrapper[4909]: I1201 10:47:21.918455 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-mq7gt" Dec 01 10:47:21 crc kubenswrapper[4909]: I1201 10:47:21.944443 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" Dec 01 10:47:22 crc kubenswrapper[4909]: I1201 10:47:22.097550 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 01 10:47:22 crc kubenswrapper[4909]: I1201 10:47:22.171683 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-qzl29"] Dec 01 10:47:22 crc kubenswrapper[4909]: W1201 10:47:22.504842 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba361527_47be_4c33_a659_f90cdabb757c.slice/crio-362e397e585607a3d595d830d68c30abac41f5e852372b774885f9a22d032a5e WatchSource:0}: Error finding container 362e397e585607a3d595d830d68c30abac41f5e852372b774885f9a22d032a5e: Status 404 returned error can't find the container with id 362e397e585607a3d595d830d68c30abac41f5e852372b774885f9a22d032a5e Dec 01 10:47:22 crc kubenswrapper[4909]: I1201 10:47:22.506598 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7vlz2"] Dec 01 10:47:22 crc kubenswrapper[4909]: I1201 10:47:22.587337 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-mq7gt"] Dec 01 10:47:22 crc kubenswrapper[4909]: I1201 10:47:22.801531 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b1856063-0b40-4fee-ab30-024128a88da8","Type":"ContainerStarted","Data":"29b398930ecb959fd03ba310cd47c65199e59bf0b228be72186285cf3d7e4fbf"} Dec 01 10:47:22 crc kubenswrapper[4909]: I1201 10:47:22.804634 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" 
event={"ID":"ba361527-47be-4c33-a659-f90cdabb757c","Type":"ContainerStarted","Data":"1fd7a3d90828bd95639b5ee25e3a089db7f86661ebc36922f2c526dfbe60a8c7"}
Dec 01 10:47:22 crc kubenswrapper[4909]: I1201 10:47:22.804692 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" event={"ID":"ba361527-47be-4c33-a659-f90cdabb757c","Type":"ContainerStarted","Data":"362e397e585607a3d595d830d68c30abac41f5e852372b774885f9a22d032a5e"}
Dec 01 10:47:22 crc kubenswrapper[4909]: I1201 10:47:22.807450 4909 generic.go:334] "Generic (PLEG): container finished" podID="c950b7d2-7fcc-482a-a890-cf70d1022cc6" containerID="6c85dc16900fd502c6a64a3f602a7c7545560bb77a79b9ed155033ca6969d483" exitCode=0
Dec 01 10:47:22 crc kubenswrapper[4909]: I1201 10:47:22.807524 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-qzl29" event={"ID":"c950b7d2-7fcc-482a-a890-cf70d1022cc6","Type":"ContainerDied","Data":"6c85dc16900fd502c6a64a3f602a7c7545560bb77a79b9ed155033ca6969d483"}
Dec 01 10:47:22 crc kubenswrapper[4909]: I1201 10:47:22.807545 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-qzl29" event={"ID":"c950b7d2-7fcc-482a-a890-cf70d1022cc6","Type":"ContainerStarted","Data":"44607080d7ba927f85e1ee13d91b0da94fd190965b995bae93d25db27e045e8f"}
Dec 01 10:47:22 crc kubenswrapper[4909]: I1201 10:47:22.808830 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mq7gt" event={"ID":"5dbfc51e-589a-43e5-805d-e5856f361b43","Type":"ContainerStarted","Data":"148e469fd2bf89f8bc74b80e86ed868fb70f14c197440562678af3d66c5569b9"}
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.500368 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-qzl29"
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.538406 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c950b7d2-7fcc-482a-a890-cf70d1022cc6-dns-svc\") pod \"c950b7d2-7fcc-482a-a890-cf70d1022cc6\" (UID: \"c950b7d2-7fcc-482a-a890-cf70d1022cc6\") "
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.538599 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6nj7\" (UniqueName: \"kubernetes.io/projected/c950b7d2-7fcc-482a-a890-cf70d1022cc6-kube-api-access-q6nj7\") pod \"c950b7d2-7fcc-482a-a890-cf70d1022cc6\" (UID: \"c950b7d2-7fcc-482a-a890-cf70d1022cc6\") "
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.538697 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c950b7d2-7fcc-482a-a890-cf70d1022cc6-ovsdbserver-sb\") pod \"c950b7d2-7fcc-482a-a890-cf70d1022cc6\" (UID: \"c950b7d2-7fcc-482a-a890-cf70d1022cc6\") "
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.538763 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c950b7d2-7fcc-482a-a890-cf70d1022cc6-config\") pod \"c950b7d2-7fcc-482a-a890-cf70d1022cc6\" (UID: \"c950b7d2-7fcc-482a-a890-cf70d1022cc6\") "
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.547273 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c950b7d2-7fcc-482a-a890-cf70d1022cc6-kube-api-access-q6nj7" (OuterVolumeSpecName: "kube-api-access-q6nj7") pod "c950b7d2-7fcc-482a-a890-cf70d1022cc6" (UID: "c950b7d2-7fcc-482a-a890-cf70d1022cc6"). InnerVolumeSpecName "kube-api-access-q6nj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.560032 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c950b7d2-7fcc-482a-a890-cf70d1022cc6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c950b7d2-7fcc-482a-a890-cf70d1022cc6" (UID: "c950b7d2-7fcc-482a-a890-cf70d1022cc6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.561818 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c950b7d2-7fcc-482a-a890-cf70d1022cc6-config" (OuterVolumeSpecName: "config") pod "c950b7d2-7fcc-482a-a890-cf70d1022cc6" (UID: "c950b7d2-7fcc-482a-a890-cf70d1022cc6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.571525 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c950b7d2-7fcc-482a-a890-cf70d1022cc6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c950b7d2-7fcc-482a-a890-cf70d1022cc6" (UID: "c950b7d2-7fcc-482a-a890-cf70d1022cc6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.642116 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6nj7\" (UniqueName: \"kubernetes.io/projected/c950b7d2-7fcc-482a-a890-cf70d1022cc6-kube-api-access-q6nj7\") on node \"crc\" DevicePath \"\""
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.642149 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c950b7d2-7fcc-482a-a890-cf70d1022cc6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.642159 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c950b7d2-7fcc-482a-a890-cf70d1022cc6-config\") on node \"crc\" DevicePath \"\""
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.642168 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c950b7d2-7fcc-482a-a890-cf70d1022cc6-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.831569 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mq7gt" event={"ID":"5dbfc51e-589a-43e5-805d-e5856f361b43","Type":"ContainerStarted","Data":"e27492bc1a93c50709d4bac8ee5b146e195f7269c78777585bff8a096d2541cc"}
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.835154 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b1856063-0b40-4fee-ab30-024128a88da8","Type":"ContainerStarted","Data":"ea9e682ebf54fb50eaaf32bf6fcc4ae2cbb744cdb4ae466216bd9113487bf684"}
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.836645 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-qzl29" event={"ID":"c950b7d2-7fcc-482a-a890-cf70d1022cc6","Type":"ContainerDied","Data":"44607080d7ba927f85e1ee13d91b0da94fd190965b995bae93d25db27e045e8f"}
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.836696 4909 scope.go:117] "RemoveContainer" containerID="6c85dc16900fd502c6a64a3f602a7c7545560bb77a79b9ed155033ca6969d483"
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.836839 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-qzl29"
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.843019 4909 generic.go:334] "Generic (PLEG): container finished" podID="ba361527-47be-4c33-a659-f90cdabb757c" containerID="1fd7a3d90828bd95639b5ee25e3a089db7f86661ebc36922f2c526dfbe60a8c7" exitCode=0
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.843089 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" event={"ID":"ba361527-47be-4c33-a659-f90cdabb757c","Type":"ContainerDied","Data":"1fd7a3d90828bd95639b5ee25e3a089db7f86661ebc36922f2c526dfbe60a8c7"}
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.878957 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-mq7gt" podStartSLOduration=2.878931042 podStartE2EDuration="2.878931042s" podCreationTimestamp="2025-12-01 10:47:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:47:23.851528492 +0000 UTC m=+961.085999390" watchObservedRunningTime="2025-12-01 10:47:23.878931042 +0000 UTC m=+961.113401940"
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.975119 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-qzl29"]
Dec 01 10:47:23 crc kubenswrapper[4909]: I1201 10:47:23.988754 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-qzl29"]
Dec 01 10:47:24 crc kubenswrapper[4909]: I1201 10:47:24.852729 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b1856063-0b40-4fee-ab30-024128a88da8","Type":"ContainerStarted","Data":"b5c43f3e945163d97564e7463a182107e9d7469b1f9b2908e9fda52c50866a05"}
Dec 01 10:47:24 crc kubenswrapper[4909]: I1201 10:47:24.853485 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Dec 01 10:47:24 crc kubenswrapper[4909]: I1201 10:47:24.880183 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.566147162 podStartE2EDuration="3.880154163s" podCreationTimestamp="2025-12-01 10:47:21 +0000 UTC" firstStartedPulling="2025-12-01 10:47:22.109710875 +0000 UTC m=+959.344181773" lastFinishedPulling="2025-12-01 10:47:23.423717876 +0000 UTC m=+960.658188774" observedRunningTime="2025-12-01 10:47:24.877749555 +0000 UTC m=+962.112220453" watchObservedRunningTime="2025-12-01 10:47:24.880154163 +0000 UTC m=+962.114625061"
Dec 01 10:47:24 crc kubenswrapper[4909]: I1201 10:47:24.915554 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Dec 01 10:47:24 crc kubenswrapper[4909]: I1201 10:47:24.915638 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Dec 01 10:47:25 crc kubenswrapper[4909]: I1201 10:47:25.269748 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c950b7d2-7fcc-482a-a890-cf70d1022cc6" path="/var/lib/kubelet/pods/c950b7d2-7fcc-482a-a890-cf70d1022cc6/volumes"
Dec 01 10:47:26 crc kubenswrapper[4909]: I1201 10:47:26.298249 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Dec 01 10:47:26 crc kubenswrapper[4909]: I1201 10:47:26.298592 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Dec 01 10:47:28 crc kubenswrapper[4909]: I1201 10:47:28.888786 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" event={"ID":"ba361527-47be-4c33-a659-f90cdabb757c","Type":"ContainerStarted","Data":"beeec8123cfad4adc9e251a33355dd05a7a2192e83ad3f4b7a376ed36474ce5c"}
Dec 01 10:47:28 crc kubenswrapper[4909]: I1201 10:47:28.908390 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gr6sc"
Dec 01 10:47:28 crc kubenswrapper[4909]: I1201 10:47:28.949292 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gr6sc"
Dec 01 10:47:29 crc kubenswrapper[4909]: I1201 10:47:29.148318 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gr6sc"]
Dec 01 10:47:29 crc kubenswrapper[4909]: I1201 10:47:29.778055 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Dec 01 10:47:29 crc kubenswrapper[4909]: I1201 10:47:29.859268 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Dec 01 10:47:29 crc kubenswrapper[4909]: I1201 10:47:29.897740 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2"
Dec 01 10:47:29 crc kubenswrapper[4909]: I1201 10:47:29.926136 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" podStartSLOduration=8.926112863 podStartE2EDuration="8.926112863s" podCreationTimestamp="2025-12-01 10:47:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:47:29.925575555 +0000 UTC m=+967.160046453" watchObservedRunningTime="2025-12-01 10:47:29.926112863 +0000 UTC m=+967.160583761"
Dec 01 10:47:30 crc kubenswrapper[4909]: I1201 10:47:30.040401 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Dec 01 10:47:30 crc kubenswrapper[4909]: I1201 10:47:30.128790 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Dec 01 10:47:30 crc kubenswrapper[4909]: I1201 10:47:30.905128 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gr6sc" podUID="103f4624-c750-4f9f-95c8-38dee6c94e77" containerName="registry-server" containerID="cri-o://c31e488d4009936f556a0395049b1b045d5a9585ada8453863543f33846f4fbd" gracePeriod=2
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.514693 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gr6sc"
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.600446 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/103f4624-c750-4f9f-95c8-38dee6c94e77-catalog-content\") pod \"103f4624-c750-4f9f-95c8-38dee6c94e77\" (UID: \"103f4624-c750-4f9f-95c8-38dee6c94e77\") "
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.600695 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4888\" (UniqueName: \"kubernetes.io/projected/103f4624-c750-4f9f-95c8-38dee6c94e77-kube-api-access-t4888\") pod \"103f4624-c750-4f9f-95c8-38dee6c94e77\" (UID: \"103f4624-c750-4f9f-95c8-38dee6c94e77\") "
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.600832 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/103f4624-c750-4f9f-95c8-38dee6c94e77-utilities\") pod \"103f4624-c750-4f9f-95c8-38dee6c94e77\" (UID: \"103f4624-c750-4f9f-95c8-38dee6c94e77\") "
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.602026 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/103f4624-c750-4f9f-95c8-38dee6c94e77-utilities" (OuterVolumeSpecName: "utilities") pod "103f4624-c750-4f9f-95c8-38dee6c94e77" (UID: "103f4624-c750-4f9f-95c8-38dee6c94e77"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.611130 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/103f4624-c750-4f9f-95c8-38dee6c94e77-kube-api-access-t4888" (OuterVolumeSpecName: "kube-api-access-t4888") pod "103f4624-c750-4f9f-95c8-38dee6c94e77" (UID: "103f4624-c750-4f9f-95c8-38dee6c94e77"). InnerVolumeSpecName "kube-api-access-t4888". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.653172 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/103f4624-c750-4f9f-95c8-38dee6c94e77-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "103f4624-c750-4f9f-95c8-38dee6c94e77" (UID: "103f4624-c750-4f9f-95c8-38dee6c94e77"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.703314 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/103f4624-c750-4f9f-95c8-38dee6c94e77-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.703383 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/103f4624-c750-4f9f-95c8-38dee6c94e77-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.703401 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4888\" (UniqueName: \"kubernetes.io/projected/103f4624-c750-4f9f-95c8-38dee6c94e77-kube-api-access-t4888\") on node \"crc\" DevicePath \"\""
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.921941 4909 generic.go:334] "Generic (PLEG): container finished" podID="103f4624-c750-4f9f-95c8-38dee6c94e77" containerID="c31e488d4009936f556a0395049b1b045d5a9585ada8453863543f33846f4fbd" exitCode=0
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.921991 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gr6sc" event={"ID":"103f4624-c750-4f9f-95c8-38dee6c94e77","Type":"ContainerDied","Data":"c31e488d4009936f556a0395049b1b045d5a9585ada8453863543f33846f4fbd"}
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.922033 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gr6sc" event={"ID":"103f4624-c750-4f9f-95c8-38dee6c94e77","Type":"ContainerDied","Data":"f2782dd79c25069c2f684a02b8243e28e4c4de717f9f7ab90005cc67528d6a57"}
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.922051 4909 scope.go:117] "RemoveContainer" containerID="c31e488d4009936f556a0395049b1b045d5a9585ada8453863543f33846f4fbd"
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.922205 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gr6sc"
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.952955 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-2905-account-create-update-v7wdd"]
Dec 01 10:47:31 crc kubenswrapper[4909]: E1201 10:47:31.953397 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103f4624-c750-4f9f-95c8-38dee6c94e77" containerName="registry-server"
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.953416 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="103f4624-c750-4f9f-95c8-38dee6c94e77" containerName="registry-server"
Dec 01 10:47:31 crc kubenswrapper[4909]: E1201 10:47:31.953440 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103f4624-c750-4f9f-95c8-38dee6c94e77" containerName="extract-utilities"
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.953448 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="103f4624-c750-4f9f-95c8-38dee6c94e77" containerName="extract-utilities"
Dec 01 10:47:31 crc kubenswrapper[4909]: E1201 10:47:31.953463 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c950b7d2-7fcc-482a-a890-cf70d1022cc6" containerName="init"
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.953469 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c950b7d2-7fcc-482a-a890-cf70d1022cc6" containerName="init"
Dec 01 10:47:31 crc kubenswrapper[4909]: E1201 10:47:31.953482 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103f4624-c750-4f9f-95c8-38dee6c94e77" containerName="extract-content"
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.953489 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="103f4624-c750-4f9f-95c8-38dee6c94e77" containerName="extract-content"
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.953673 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c950b7d2-7fcc-482a-a890-cf70d1022cc6" containerName="init"
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.953687 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="103f4624-c750-4f9f-95c8-38dee6c94e77" containerName="registry-server"
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.954345 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2905-account-create-update-v7wdd"
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.961723 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.964663 4909 scope.go:117] "RemoveContainer" containerID="fe17580222926fca6d9e55697c257add42a18217925e8822edeaf724ff0d9ee0"
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.978971 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-hlldf"]
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.980899 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hlldf"
Dec 01 10:47:31 crc kubenswrapper[4909]: I1201 10:47:31.998027 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-hlldf"]
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.007096 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2905-account-create-update-v7wdd"]
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.011270 4909 scope.go:117] "RemoveContainer" containerID="9772df0d1dc544dad54951fba69757249e48c919461538c7ca32396b06f59ea3"
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.014352 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gr6sc"]
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.020720 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gr6sc"]
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.053445 4909 scope.go:117] "RemoveContainer" containerID="c31e488d4009936f556a0395049b1b045d5a9585ada8453863543f33846f4fbd"
Dec 01 10:47:32 crc kubenswrapper[4909]: E1201 10:47:32.054214 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c31e488d4009936f556a0395049b1b045d5a9585ada8453863543f33846f4fbd\": container with ID starting with c31e488d4009936f556a0395049b1b045d5a9585ada8453863543f33846f4fbd not found: ID does not exist" containerID="c31e488d4009936f556a0395049b1b045d5a9585ada8453863543f33846f4fbd"
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.054273 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c31e488d4009936f556a0395049b1b045d5a9585ada8453863543f33846f4fbd"} err="failed to get container status \"c31e488d4009936f556a0395049b1b045d5a9585ada8453863543f33846f4fbd\": rpc error: code = NotFound desc = could not find container \"c31e488d4009936f556a0395049b1b045d5a9585ada8453863543f33846f4fbd\": container with ID starting with c31e488d4009936f556a0395049b1b045d5a9585ada8453863543f33846f4fbd not found: ID does not exist"
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.054309 4909 scope.go:117] "RemoveContainer" containerID="fe17580222926fca6d9e55697c257add42a18217925e8822edeaf724ff0d9ee0"
Dec 01 10:47:32 crc kubenswrapper[4909]: E1201 10:47:32.055231 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe17580222926fca6d9e55697c257add42a18217925e8822edeaf724ff0d9ee0\": container with ID starting with fe17580222926fca6d9e55697c257add42a18217925e8822edeaf724ff0d9ee0 not found: ID does not exist" containerID="fe17580222926fca6d9e55697c257add42a18217925e8822edeaf724ff0d9ee0"
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.055279 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe17580222926fca6d9e55697c257add42a18217925e8822edeaf724ff0d9ee0"} err="failed to get container status \"fe17580222926fca6d9e55697c257add42a18217925e8822edeaf724ff0d9ee0\": rpc error: code = NotFound desc = could not find container \"fe17580222926fca6d9e55697c257add42a18217925e8822edeaf724ff0d9ee0\": container with ID starting with fe17580222926fca6d9e55697c257add42a18217925e8822edeaf724ff0d9ee0 not found: ID does not exist"
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.055309 4909 scope.go:117] "RemoveContainer" containerID="9772df0d1dc544dad54951fba69757249e48c919461538c7ca32396b06f59ea3"
Dec 01 10:47:32 crc kubenswrapper[4909]: E1201 10:47:32.056536 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9772df0d1dc544dad54951fba69757249e48c919461538c7ca32396b06f59ea3\": container with ID starting with 9772df0d1dc544dad54951fba69757249e48c919461538c7ca32396b06f59ea3 not found: ID does not exist" containerID="9772df0d1dc544dad54951fba69757249e48c919461538c7ca32396b06f59ea3"
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.056570 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9772df0d1dc544dad54951fba69757249e48c919461538c7ca32396b06f59ea3"} err="failed to get container status \"9772df0d1dc544dad54951fba69757249e48c919461538c7ca32396b06f59ea3\": rpc error: code = NotFound desc = could not find container \"9772df0d1dc544dad54951fba69757249e48c919461538c7ca32396b06f59ea3\": container with ID starting with 9772df0d1dc544dad54951fba69757249e48c919461538c7ca32396b06f59ea3 not found: ID does not exist"
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.111945 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/384e1871-1564-43db-87f7-522394755854-operator-scripts\") pod \"glance-db-create-hlldf\" (UID: \"384e1871-1564-43db-87f7-522394755854\") " pod="openstack/glance-db-create-hlldf"
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.112137 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/411e6d91-2c4d-44a0-94ed-09347706bc05-operator-scripts\") pod \"glance-2905-account-create-update-v7wdd\" (UID: \"411e6d91-2c4d-44a0-94ed-09347706bc05\") " pod="openstack/glance-2905-account-create-update-v7wdd"
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.112180 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grpfn\" (UniqueName: \"kubernetes.io/projected/411e6d91-2c4d-44a0-94ed-09347706bc05-kube-api-access-grpfn\") pod \"glance-2905-account-create-update-v7wdd\" (UID: \"411e6d91-2c4d-44a0-94ed-09347706bc05\") " pod="openstack/glance-2905-account-create-update-v7wdd"
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.112218 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwbvk\" (UniqueName: \"kubernetes.io/projected/384e1871-1564-43db-87f7-522394755854-kube-api-access-jwbvk\") pod \"glance-db-create-hlldf\" (UID: \"384e1871-1564-43db-87f7-522394755854\") " pod="openstack/glance-db-create-hlldf"
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.214313 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/411e6d91-2c4d-44a0-94ed-09347706bc05-operator-scripts\") pod \"glance-2905-account-create-update-v7wdd\" (UID: \"411e6d91-2c4d-44a0-94ed-09347706bc05\") " pod="openstack/glance-2905-account-create-update-v7wdd"
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.214379 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grpfn\" (UniqueName: \"kubernetes.io/projected/411e6d91-2c4d-44a0-94ed-09347706bc05-kube-api-access-grpfn\") pod \"glance-2905-account-create-update-v7wdd\" (UID: \"411e6d91-2c4d-44a0-94ed-09347706bc05\") " pod="openstack/glance-2905-account-create-update-v7wdd"
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.214416 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwbvk\" (UniqueName: \"kubernetes.io/projected/384e1871-1564-43db-87f7-522394755854-kube-api-access-jwbvk\") pod \"glance-db-create-hlldf\" (UID: \"384e1871-1564-43db-87f7-522394755854\") " pod="openstack/glance-db-create-hlldf"
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.214486 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/384e1871-1564-43db-87f7-522394755854-operator-scripts\") pod \"glance-db-create-hlldf\" (UID: \"384e1871-1564-43db-87f7-522394755854\") " pod="openstack/glance-db-create-hlldf"
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.215846 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/384e1871-1564-43db-87f7-522394755854-operator-scripts\") pod \"glance-db-create-hlldf\" (UID: \"384e1871-1564-43db-87f7-522394755854\") " pod="openstack/glance-db-create-hlldf"
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.216157 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/411e6d91-2c4d-44a0-94ed-09347706bc05-operator-scripts\") pod \"glance-2905-account-create-update-v7wdd\" (UID: \"411e6d91-2c4d-44a0-94ed-09347706bc05\") " pod="openstack/glance-2905-account-create-update-v7wdd"
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.241061 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwbvk\" (UniqueName: \"kubernetes.io/projected/384e1871-1564-43db-87f7-522394755854-kube-api-access-jwbvk\") pod \"glance-db-create-hlldf\" (UID: \"384e1871-1564-43db-87f7-522394755854\") " pod="openstack/glance-db-create-hlldf"
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.241289 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grpfn\" (UniqueName: \"kubernetes.io/projected/411e6d91-2c4d-44a0-94ed-09347706bc05-kube-api-access-grpfn\") pod \"glance-2905-account-create-update-v7wdd\" (UID: \"411e6d91-2c4d-44a0-94ed-09347706bc05\") " pod="openstack/glance-2905-account-create-update-v7wdd"
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.287196 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2905-account-create-update-v7wdd"
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.345308 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hlldf"
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.771399 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2905-account-create-update-v7wdd"]
Dec 01 10:47:32 crc kubenswrapper[4909]: W1201 10:47:32.775697 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod411e6d91_2c4d_44a0_94ed_09347706bc05.slice/crio-ca5a9867fc4bd17fd7ae576fc7612890cac7b259fc02354455fc9de81572ba6f WatchSource:0}: Error finding container ca5a9867fc4bd17fd7ae576fc7612890cac7b259fc02354455fc9de81572ba6f: Status 404 returned error can't find the container with id ca5a9867fc4bd17fd7ae576fc7612890cac7b259fc02354455fc9de81572ba6f
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.896384 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-hlldf"]
Dec 01 10:47:32 crc kubenswrapper[4909]: W1201 10:47:32.900753 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod384e1871_1564_43db_87f7_522394755854.slice/crio-3d9c648d46932b48b4cf1426ad4d86767512bb361f90c6743cf2dff34a7c196f WatchSource:0}: Error finding container 3d9c648d46932b48b4cf1426ad4d86767512bb361f90c6743cf2dff34a7c196f: Status 404 returned error can't find the container with id 3d9c648d46932b48b4cf1426ad4d86767512bb361f90c6743cf2dff34a7c196f
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.941325 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hlldf" event={"ID":"384e1871-1564-43db-87f7-522394755854","Type":"ContainerStarted","Data":"3d9c648d46932b48b4cf1426ad4d86767512bb361f90c6743cf2dff34a7c196f"}
Dec 01 10:47:32 crc kubenswrapper[4909]: I1201 10:47:32.942663 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2905-account-create-update-v7wdd" event={"ID":"411e6d91-2c4d-44a0-94ed-09347706bc05","Type":"ContainerStarted","Data":"ca5a9867fc4bd17fd7ae576fc7612890cac7b259fc02354455fc9de81572ba6f"}
Dec 01 10:47:33 crc kubenswrapper[4909]: I1201 10:47:33.267282 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="103f4624-c750-4f9f-95c8-38dee6c94e77" path="/var/lib/kubelet/pods/103f4624-c750-4f9f-95c8-38dee6c94e77/volumes"
Dec 01 10:47:33 crc kubenswrapper[4909]: I1201 10:47:33.956583 4909 generic.go:334] "Generic (PLEG): container finished" podID="384e1871-1564-43db-87f7-522394755854" containerID="c2cfc4149410bf7412755a0a3ec28e44bac6a0bd97ef8ec5c1164b1ed29f18fa" exitCode=0
Dec 01 10:47:33 crc kubenswrapper[4909]: I1201 10:47:33.956651 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hlldf" event={"ID":"384e1871-1564-43db-87f7-522394755854","Type":"ContainerDied","Data":"c2cfc4149410bf7412755a0a3ec28e44bac6a0bd97ef8ec5c1164b1ed29f18fa"}
Dec 01 10:47:33 crc kubenswrapper[4909]: I1201 10:47:33.958862 4909 generic.go:334] "Generic (PLEG): container finished" podID="411e6d91-2c4d-44a0-94ed-09347706bc05" containerID="bc7a3ac1ad562ff021dd6fb836dfaf3d6e6849078d152d1703cafc764afe29e5" exitCode=0
Dec 01 10:47:33 crc kubenswrapper[4909]: I1201 10:47:33.958902 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2905-account-create-update-v7wdd" event={"ID":"411e6d91-2c4d-44a0-94ed-09347706bc05","Type":"ContainerDied","Data":"bc7a3ac1ad562ff021dd6fb836dfaf3d6e6849078d152d1703cafc764afe29e5"}
Dec 01 10:47:35 crc kubenswrapper[4909]: I1201 10:47:35.361352 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2905-account-create-update-v7wdd"
Dec 01 10:47:35 crc kubenswrapper[4909]: I1201 10:47:35.447168 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hlldf"
Dec 01 10:47:35 crc kubenswrapper[4909]: I1201 10:47:35.470235 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/411e6d91-2c4d-44a0-94ed-09347706bc05-operator-scripts\") pod \"411e6d91-2c4d-44a0-94ed-09347706bc05\" (UID: \"411e6d91-2c4d-44a0-94ed-09347706bc05\") "
Dec 01 10:47:35 crc kubenswrapper[4909]: I1201 10:47:35.470294 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grpfn\" (UniqueName: \"kubernetes.io/projected/411e6d91-2c4d-44a0-94ed-09347706bc05-kube-api-access-grpfn\") pod \"411e6d91-2c4d-44a0-94ed-09347706bc05\" (UID: \"411e6d91-2c4d-44a0-94ed-09347706bc05\") "
Dec 01 10:47:35 crc kubenswrapper[4909]: I1201 10:47:35.471175 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/411e6d91-2c4d-44a0-94ed-09347706bc05-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "411e6d91-2c4d-44a0-94ed-09347706bc05" (UID: "411e6d91-2c4d-44a0-94ed-09347706bc05"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:47:35 crc kubenswrapper[4909]: I1201 10:47:35.475842 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/411e6d91-2c4d-44a0-94ed-09347706bc05-kube-api-access-grpfn" (OuterVolumeSpecName: "kube-api-access-grpfn") pod "411e6d91-2c4d-44a0-94ed-09347706bc05" (UID: "411e6d91-2c4d-44a0-94ed-09347706bc05"). InnerVolumeSpecName "kube-api-access-grpfn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:47:35 crc kubenswrapper[4909]: I1201 10:47:35.572297 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwbvk\" (UniqueName: \"kubernetes.io/projected/384e1871-1564-43db-87f7-522394755854-kube-api-access-jwbvk\") pod \"384e1871-1564-43db-87f7-522394755854\" (UID: \"384e1871-1564-43db-87f7-522394755854\") "
Dec 01 10:47:35 crc kubenswrapper[4909]: I1201 10:47:35.572484 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/384e1871-1564-43db-87f7-522394755854-operator-scripts\") pod \"384e1871-1564-43db-87f7-522394755854\" (UID: \"384e1871-1564-43db-87f7-522394755854\") "
Dec 01 10:47:35 crc kubenswrapper[4909]: I1201 10:47:35.573039 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grpfn\" (UniqueName: \"kubernetes.io/projected/411e6d91-2c4d-44a0-94ed-09347706bc05-kube-api-access-grpfn\") on node \"crc\" DevicePath \"\""
Dec 01 10:47:35 crc kubenswrapper[4909]: I1201 10:47:35.573066 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/411e6d91-2c4d-44a0-94ed-09347706bc05-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 10:47:35 crc kubenswrapper[4909]: I1201 10:47:35.573103 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/384e1871-1564-43db-87f7-522394755854-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "384e1871-1564-43db-87f7-522394755854" (UID: "384e1871-1564-43db-87f7-522394755854"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:47:35 crc kubenswrapper[4909]: I1201 10:47:35.575611 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/384e1871-1564-43db-87f7-522394755854-kube-api-access-jwbvk" (OuterVolumeSpecName: "kube-api-access-jwbvk") pod "384e1871-1564-43db-87f7-522394755854" (UID: "384e1871-1564-43db-87f7-522394755854"). InnerVolumeSpecName "kube-api-access-jwbvk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:47:35 crc kubenswrapper[4909]: I1201 10:47:35.674727 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwbvk\" (UniqueName: \"kubernetes.io/projected/384e1871-1564-43db-87f7-522394755854-kube-api-access-jwbvk\") on node \"crc\" DevicePath \"\""
Dec 01 10:47:35 crc kubenswrapper[4909]: I1201 10:47:35.674775 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/384e1871-1564-43db-87f7-522394755854-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 10:47:35 crc kubenswrapper[4909]: I1201 10:47:35.977087 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-hlldf" Dec 01 10:47:35 crc kubenswrapper[4909]: I1201 10:47:35.977084 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hlldf" event={"ID":"384e1871-1564-43db-87f7-522394755854","Type":"ContainerDied","Data":"3d9c648d46932b48b4cf1426ad4d86767512bb361f90c6743cf2dff34a7c196f"} Dec 01 10:47:35 crc kubenswrapper[4909]: I1201 10:47:35.977220 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d9c648d46932b48b4cf1426ad4d86767512bb361f90c6743cf2dff34a7c196f" Dec 01 10:47:35 crc kubenswrapper[4909]: I1201 10:47:35.979005 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2905-account-create-update-v7wdd" event={"ID":"411e6d91-2c4d-44a0-94ed-09347706bc05","Type":"ContainerDied","Data":"ca5a9867fc4bd17fd7ae576fc7612890cac7b259fc02354455fc9de81572ba6f"} Dec 01 10:47:35 crc kubenswrapper[4909]: I1201 10:47:35.979034 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca5a9867fc4bd17fd7ae576fc7612890cac7b259fc02354455fc9de81572ba6f" Dec 01 10:47:35 crc kubenswrapper[4909]: I1201 10:47:35.979177 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2905-account-create-update-v7wdd" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.122661 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-jt8fv"] Dec 01 10:47:36 crc kubenswrapper[4909]: E1201 10:47:36.123102 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="384e1871-1564-43db-87f7-522394755854" containerName="mariadb-database-create" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.123118 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="384e1871-1564-43db-87f7-522394755854" containerName="mariadb-database-create" Dec 01 10:47:36 crc kubenswrapper[4909]: E1201 10:47:36.123133 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="411e6d91-2c4d-44a0-94ed-09347706bc05" containerName="mariadb-account-create-update" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.123140 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="411e6d91-2c4d-44a0-94ed-09347706bc05" containerName="mariadb-account-create-update" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.123300 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="411e6d91-2c4d-44a0-94ed-09347706bc05" containerName="mariadb-account-create-update" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.123317 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="384e1871-1564-43db-87f7-522394755854" containerName="mariadb-database-create" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.123964 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-jt8fv" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.130036 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jt8fv"] Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.180774 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jvh6\" (UniqueName: \"kubernetes.io/projected/33db1774-4b90-4cba-be63-744f8b79f29c-kube-api-access-6jvh6\") pod \"keystone-db-create-jt8fv\" (UID: \"33db1774-4b90-4cba-be63-744f8b79f29c\") " pod="openstack/keystone-db-create-jt8fv" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.180944 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33db1774-4b90-4cba-be63-744f8b79f29c-operator-scripts\") pod \"keystone-db-create-jt8fv\" (UID: \"33db1774-4b90-4cba-be63-744f8b79f29c\") " pod="openstack/keystone-db-create-jt8fv" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.226075 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4143-account-create-update-9v4tm"] Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.232282 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4143-account-create-update-9v4tm" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.233285 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4143-account-create-update-9v4tm"] Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.245452 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.282524 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jvh6\" (UniqueName: \"kubernetes.io/projected/33db1774-4b90-4cba-be63-744f8b79f29c-kube-api-access-6jvh6\") pod \"keystone-db-create-jt8fv\" (UID: \"33db1774-4b90-4cba-be63-744f8b79f29c\") " pod="openstack/keystone-db-create-jt8fv" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.282632 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m92d6\" (UniqueName: \"kubernetes.io/projected/8f21bdf9-c58c-4b57-b3fe-f64338c83e32-kube-api-access-m92d6\") pod \"keystone-4143-account-create-update-9v4tm\" (UID: \"8f21bdf9-c58c-4b57-b3fe-f64338c83e32\") " pod="openstack/keystone-4143-account-create-update-9v4tm" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.282720 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33db1774-4b90-4cba-be63-744f8b79f29c-operator-scripts\") pod \"keystone-db-create-jt8fv\" (UID: \"33db1774-4b90-4cba-be63-744f8b79f29c\") " pod="openstack/keystone-db-create-jt8fv" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.282745 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f21bdf9-c58c-4b57-b3fe-f64338c83e32-operator-scripts\") pod \"keystone-4143-account-create-update-9v4tm\" (UID: 
\"8f21bdf9-c58c-4b57-b3fe-f64338c83e32\") " pod="openstack/keystone-4143-account-create-update-9v4tm" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.283986 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33db1774-4b90-4cba-be63-744f8b79f29c-operator-scripts\") pod \"keystone-db-create-jt8fv\" (UID: \"33db1774-4b90-4cba-be63-744f8b79f29c\") " pod="openstack/keystone-db-create-jt8fv" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.304745 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jvh6\" (UniqueName: \"kubernetes.io/projected/33db1774-4b90-4cba-be63-744f8b79f29c-kube-api-access-6jvh6\") pod \"keystone-db-create-jt8fv\" (UID: \"33db1774-4b90-4cba-be63-744f8b79f29c\") " pod="openstack/keystone-db-create-jt8fv" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.384540 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m92d6\" (UniqueName: \"kubernetes.io/projected/8f21bdf9-c58c-4b57-b3fe-f64338c83e32-kube-api-access-m92d6\") pod \"keystone-4143-account-create-update-9v4tm\" (UID: \"8f21bdf9-c58c-4b57-b3fe-f64338c83e32\") " pod="openstack/keystone-4143-account-create-update-9v4tm" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.384625 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f21bdf9-c58c-4b57-b3fe-f64338c83e32-operator-scripts\") pod \"keystone-4143-account-create-update-9v4tm\" (UID: \"8f21bdf9-c58c-4b57-b3fe-f64338c83e32\") " pod="openstack/keystone-4143-account-create-update-9v4tm" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.385469 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f21bdf9-c58c-4b57-b3fe-f64338c83e32-operator-scripts\") pod 
\"keystone-4143-account-create-update-9v4tm\" (UID: \"8f21bdf9-c58c-4b57-b3fe-f64338c83e32\") " pod="openstack/keystone-4143-account-create-update-9v4tm" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.407124 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m92d6\" (UniqueName: \"kubernetes.io/projected/8f21bdf9-c58c-4b57-b3fe-f64338c83e32-kube-api-access-m92d6\") pod \"keystone-4143-account-create-update-9v4tm\" (UID: \"8f21bdf9-c58c-4b57-b3fe-f64338c83e32\") " pod="openstack/keystone-4143-account-create-update-9v4tm" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.493253 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jt8fv" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.511165 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-p56hk"] Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.512723 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-p56hk" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.520279 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-p56hk"] Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.560188 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4143-account-create-update-9v4tm" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.592371 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af2c338d-8319-457b-ad3b-cbf51df55d8a-operator-scripts\") pod \"placement-db-create-p56hk\" (UID: \"af2c338d-8319-457b-ad3b-cbf51df55d8a\") " pod="openstack/placement-db-create-p56hk" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.592586 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm6hv\" (UniqueName: \"kubernetes.io/projected/af2c338d-8319-457b-ad3b-cbf51df55d8a-kube-api-access-pm6hv\") pod \"placement-db-create-p56hk\" (UID: \"af2c338d-8319-457b-ad3b-cbf51df55d8a\") " pod="openstack/placement-db-create-p56hk" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.614040 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-def2-account-create-update-4n7lx"] Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.615293 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-def2-account-create-update-4n7lx" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.629584 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.640758 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-def2-account-create-update-4n7lx"] Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.696324 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f9510b7-d3ab-4508-bf7e-968d2d3b684c-operator-scripts\") pod \"placement-def2-account-create-update-4n7lx\" (UID: \"9f9510b7-d3ab-4508-bf7e-968d2d3b684c\") " pod="openstack/placement-def2-account-create-update-4n7lx" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.696377 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppb2r\" (UniqueName: \"kubernetes.io/projected/9f9510b7-d3ab-4508-bf7e-968d2d3b684c-kube-api-access-ppb2r\") pod \"placement-def2-account-create-update-4n7lx\" (UID: \"9f9510b7-d3ab-4508-bf7e-968d2d3b684c\") " pod="openstack/placement-def2-account-create-update-4n7lx" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.696800 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af2c338d-8319-457b-ad3b-cbf51df55d8a-operator-scripts\") pod \"placement-db-create-p56hk\" (UID: \"af2c338d-8319-457b-ad3b-cbf51df55d8a\") " pod="openstack/placement-db-create-p56hk" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.697034 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm6hv\" (UniqueName: \"kubernetes.io/projected/af2c338d-8319-457b-ad3b-cbf51df55d8a-kube-api-access-pm6hv\") pod 
\"placement-db-create-p56hk\" (UID: \"af2c338d-8319-457b-ad3b-cbf51df55d8a\") " pod="openstack/placement-db-create-p56hk" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.698020 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af2c338d-8319-457b-ad3b-cbf51df55d8a-operator-scripts\") pod \"placement-db-create-p56hk\" (UID: \"af2c338d-8319-457b-ad3b-cbf51df55d8a\") " pod="openstack/placement-db-create-p56hk" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.722639 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm6hv\" (UniqueName: \"kubernetes.io/projected/af2c338d-8319-457b-ad3b-cbf51df55d8a-kube-api-access-pm6hv\") pod \"placement-db-create-p56hk\" (UID: \"af2c338d-8319-457b-ad3b-cbf51df55d8a\") " pod="openstack/placement-db-create-p56hk" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.743648 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.799373 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f9510b7-d3ab-4508-bf7e-968d2d3b684c-operator-scripts\") pod \"placement-def2-account-create-update-4n7lx\" (UID: \"9f9510b7-d3ab-4508-bf7e-968d2d3b684c\") " pod="openstack/placement-def2-account-create-update-4n7lx" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.799430 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppb2r\" (UniqueName: \"kubernetes.io/projected/9f9510b7-d3ab-4508-bf7e-968d2d3b684c-kube-api-access-ppb2r\") pod \"placement-def2-account-create-update-4n7lx\" (UID: \"9f9510b7-d3ab-4508-bf7e-968d2d3b684c\") " pod="openstack/placement-def2-account-create-update-4n7lx" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.800302 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f9510b7-d3ab-4508-bf7e-968d2d3b684c-operator-scripts\") pod \"placement-def2-account-create-update-4n7lx\" (UID: \"9f9510b7-d3ab-4508-bf7e-968d2d3b684c\") " pod="openstack/placement-def2-account-create-update-4n7lx" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.819419 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppb2r\" (UniqueName: \"kubernetes.io/projected/9f9510b7-d3ab-4508-bf7e-968d2d3b684c-kube-api-access-ppb2r\") pod \"placement-def2-account-create-update-4n7lx\" (UID: \"9f9510b7-d3ab-4508-bf7e-968d2d3b684c\") " pod="openstack/placement-def2-account-create-update-4n7lx" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.917616 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-p56hk" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.947530 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" Dec 01 10:47:36 crc kubenswrapper[4909]: I1201 10:47:36.949684 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-def2-account-create-update-4n7lx" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.025859 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-sl7td"] Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.026278 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-sl7td" podUID="e2ecb921-0356-4a1d-93dd-1fb91e41e081" containerName="dnsmasq-dns" containerID="cri-o://695a6f1b16e98de141bcf864291d8e68b645eccf3c8886650c68151e2d8d6c60" gracePeriod=10 Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.048153 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jt8fv"] Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.138744 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4143-account-create-update-9v4tm"] Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.178252 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-xx8p2"] Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.183564 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-xx8p2" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.191561 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-tmkwf" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.192346 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 01 10:47:37 crc kubenswrapper[4909]: W1201 10:47:37.193216 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f21bdf9_c58c_4b57_b3fe_f64338c83e32.slice/crio-84145e321db04dc5690b6ca6a203d1a966c5f1442abb510f1c979e532e6a7f1f WatchSource:0}: Error finding container 84145e321db04dc5690b6ca6a203d1a966c5f1442abb510f1c979e532e6a7f1f: Status 404 returned error can't find the container with id 84145e321db04dc5690b6ca6a203d1a966c5f1442abb510f1c979e532e6a7f1f Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.207404 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e80149e-1959-4ae7-a8b3-41fc91f45121-config-data\") pod \"glance-db-sync-xx8p2\" (UID: \"2e80149e-1959-4ae7-a8b3-41fc91f45121\") " pod="openstack/glance-db-sync-xx8p2" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.207459 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz5rr\" (UniqueName: \"kubernetes.io/projected/2e80149e-1959-4ae7-a8b3-41fc91f45121-kube-api-access-vz5rr\") pod \"glance-db-sync-xx8p2\" (UID: \"2e80149e-1959-4ae7-a8b3-41fc91f45121\") " pod="openstack/glance-db-sync-xx8p2" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.207508 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2e80149e-1959-4ae7-a8b3-41fc91f45121-combined-ca-bundle\") pod \"glance-db-sync-xx8p2\" (UID: \"2e80149e-1959-4ae7-a8b3-41fc91f45121\") " pod="openstack/glance-db-sync-xx8p2" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.207629 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2e80149e-1959-4ae7-a8b3-41fc91f45121-db-sync-config-data\") pod \"glance-db-sync-xx8p2\" (UID: \"2e80149e-1959-4ae7-a8b3-41fc91f45121\") " pod="openstack/glance-db-sync-xx8p2" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.210446 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xx8p2"] Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.309684 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e80149e-1959-4ae7-a8b3-41fc91f45121-combined-ca-bundle\") pod \"glance-db-sync-xx8p2\" (UID: \"2e80149e-1959-4ae7-a8b3-41fc91f45121\") " pod="openstack/glance-db-sync-xx8p2" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.310523 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2e80149e-1959-4ae7-a8b3-41fc91f45121-db-sync-config-data\") pod \"glance-db-sync-xx8p2\" (UID: \"2e80149e-1959-4ae7-a8b3-41fc91f45121\") " pod="openstack/glance-db-sync-xx8p2" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.310681 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e80149e-1959-4ae7-a8b3-41fc91f45121-config-data\") pod \"glance-db-sync-xx8p2\" (UID: \"2e80149e-1959-4ae7-a8b3-41fc91f45121\") " pod="openstack/glance-db-sync-xx8p2" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.310709 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-vz5rr\" (UniqueName: \"kubernetes.io/projected/2e80149e-1959-4ae7-a8b3-41fc91f45121-kube-api-access-vz5rr\") pod \"glance-db-sync-xx8p2\" (UID: \"2e80149e-1959-4ae7-a8b3-41fc91f45121\") " pod="openstack/glance-db-sync-xx8p2" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.316128 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2e80149e-1959-4ae7-a8b3-41fc91f45121-db-sync-config-data\") pod \"glance-db-sync-xx8p2\" (UID: \"2e80149e-1959-4ae7-a8b3-41fc91f45121\") " pod="openstack/glance-db-sync-xx8p2" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.316664 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e80149e-1959-4ae7-a8b3-41fc91f45121-combined-ca-bundle\") pod \"glance-db-sync-xx8p2\" (UID: \"2e80149e-1959-4ae7-a8b3-41fc91f45121\") " pod="openstack/glance-db-sync-xx8p2" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.316669 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e80149e-1959-4ae7-a8b3-41fc91f45121-config-data\") pod \"glance-db-sync-xx8p2\" (UID: \"2e80149e-1959-4ae7-a8b3-41fc91f45121\") " pod="openstack/glance-db-sync-xx8p2" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.331094 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz5rr\" (UniqueName: \"kubernetes.io/projected/2e80149e-1959-4ae7-a8b3-41fc91f45121-kube-api-access-vz5rr\") pod \"glance-db-sync-xx8p2\" (UID: \"2e80149e-1959-4ae7-a8b3-41fc91f45121\") " pod="openstack/glance-db-sync-xx8p2" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.334803 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-sl7td" podUID="e2ecb921-0356-4a1d-93dd-1fb91e41e081" containerName="dnsmasq-dns" probeResult="failure" 
output="dial tcp 10.217.0.97:5353: connect: connection refused" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.487125 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-p56hk"] Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.511995 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xx8p2" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.655053 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-sl7td" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.688850 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-def2-account-create-update-4n7lx"] Dec 01 10:47:37 crc kubenswrapper[4909]: W1201 10:47:37.718508 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f9510b7_d3ab_4508_bf7e_968d2d3b684c.slice/crio-ba745ab483d85ac5a8cd9a046d68306f9c4c3a82779424184fabe00455bb17ba WatchSource:0}: Error finding container ba745ab483d85ac5a8cd9a046d68306f9c4c3a82779424184fabe00455bb17ba: Status 404 returned error can't find the container with id ba745ab483d85ac5a8cd9a046d68306f9c4c3a82779424184fabe00455bb17ba Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.718749 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2ecb921-0356-4a1d-93dd-1fb91e41e081-dns-svc\") pod \"e2ecb921-0356-4a1d-93dd-1fb91e41e081\" (UID: \"e2ecb921-0356-4a1d-93dd-1fb91e41e081\") " Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.794166 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2ecb921-0356-4a1d-93dd-1fb91e41e081-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e2ecb921-0356-4a1d-93dd-1fb91e41e081" (UID: "e2ecb921-0356-4a1d-93dd-1fb91e41e081"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.821074 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mxt2\" (UniqueName: \"kubernetes.io/projected/e2ecb921-0356-4a1d-93dd-1fb91e41e081-kube-api-access-9mxt2\") pod \"e2ecb921-0356-4a1d-93dd-1fb91e41e081\" (UID: \"e2ecb921-0356-4a1d-93dd-1fb91e41e081\") " Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.823142 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2ecb921-0356-4a1d-93dd-1fb91e41e081-config\") pod \"e2ecb921-0356-4a1d-93dd-1fb91e41e081\" (UID: \"e2ecb921-0356-4a1d-93dd-1fb91e41e081\") " Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.823744 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2ecb921-0356-4a1d-93dd-1fb91e41e081-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.832337 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2ecb921-0356-4a1d-93dd-1fb91e41e081-kube-api-access-9mxt2" (OuterVolumeSpecName: "kube-api-access-9mxt2") pod "e2ecb921-0356-4a1d-93dd-1fb91e41e081" (UID: "e2ecb921-0356-4a1d-93dd-1fb91e41e081"). InnerVolumeSpecName "kube-api-access-9mxt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.887706 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2ecb921-0356-4a1d-93dd-1fb91e41e081-config" (OuterVolumeSpecName: "config") pod "e2ecb921-0356-4a1d-93dd-1fb91e41e081" (UID: "e2ecb921-0356-4a1d-93dd-1fb91e41e081"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.924500 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2ecb921-0356-4a1d-93dd-1fb91e41e081-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:37 crc kubenswrapper[4909]: I1201 10:47:37.924547 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mxt2\" (UniqueName: \"kubernetes.io/projected/e2ecb921-0356-4a1d-93dd-1fb91e41e081-kube-api-access-9mxt2\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.016647 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-def2-account-create-update-4n7lx" event={"ID":"9f9510b7-d3ab-4508-bf7e-968d2d3b684c","Type":"ContainerStarted","Data":"230179b68886a457a25985c3f4a52bfde6e833e8502b7cd22167b8ecea19df80"} Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.016711 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-def2-account-create-update-4n7lx" event={"ID":"9f9510b7-d3ab-4508-bf7e-968d2d3b684c","Type":"ContainerStarted","Data":"ba745ab483d85ac5a8cd9a046d68306f9c4c3a82779424184fabe00455bb17ba"} Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.019237 4909 generic.go:334] "Generic (PLEG): container finished" podID="8f21bdf9-c58c-4b57-b3fe-f64338c83e32" containerID="56371d0bb45f86268ad7522af8866f1d19b11fc7d8500014660c74b3dea75f46" exitCode=0 Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.019328 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4143-account-create-update-9v4tm" event={"ID":"8f21bdf9-c58c-4b57-b3fe-f64338c83e32","Type":"ContainerDied","Data":"56371d0bb45f86268ad7522af8866f1d19b11fc7d8500014660c74b3dea75f46"} Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.019383 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-4143-account-create-update-9v4tm" event={"ID":"8f21bdf9-c58c-4b57-b3fe-f64338c83e32","Type":"ContainerStarted","Data":"84145e321db04dc5690b6ca6a203d1a966c5f1442abb510f1c979e532e6a7f1f"} Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.021323 4909 generic.go:334] "Generic (PLEG): container finished" podID="33db1774-4b90-4cba-be63-744f8b79f29c" containerID="1f91f3690f8e907ac02cdf05720ddd9d69a17399820d9ea258c556c4198a9585" exitCode=0 Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.021462 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jt8fv" event={"ID":"33db1774-4b90-4cba-be63-744f8b79f29c","Type":"ContainerDied","Data":"1f91f3690f8e907ac02cdf05720ddd9d69a17399820d9ea258c556c4198a9585"} Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.021512 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jt8fv" event={"ID":"33db1774-4b90-4cba-be63-744f8b79f29c","Type":"ContainerStarted","Data":"fbdea37aed2ab52400a8735d6dc697e1e49883e1193267bf587c10aed8dbf44d"} Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.024679 4909 generic.go:334] "Generic (PLEG): container finished" podID="e2ecb921-0356-4a1d-93dd-1fb91e41e081" containerID="695a6f1b16e98de141bcf864291d8e68b645eccf3c8886650c68151e2d8d6c60" exitCode=0 Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.024779 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-sl7td" Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.024813 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-sl7td" event={"ID":"e2ecb921-0356-4a1d-93dd-1fb91e41e081","Type":"ContainerDied","Data":"695a6f1b16e98de141bcf864291d8e68b645eccf3c8886650c68151e2d8d6c60"} Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.024856 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-sl7td" event={"ID":"e2ecb921-0356-4a1d-93dd-1fb91e41e081","Type":"ContainerDied","Data":"e254f8739b11037302726b36760e5c1d60fd38cf830e73eeda201d079043cee1"} Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.024894 4909 scope.go:117] "RemoveContainer" containerID="695a6f1b16e98de141bcf864291d8e68b645eccf3c8886650c68151e2d8d6c60" Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.031539 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p56hk" event={"ID":"af2c338d-8319-457b-ad3b-cbf51df55d8a","Type":"ContainerStarted","Data":"d54eb1d388ec8d4fd2cb7eca3e3eea821fabdbee351d5546657d71b37b528015"} Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.031567 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p56hk" event={"ID":"af2c338d-8319-457b-ad3b-cbf51df55d8a","Type":"ContainerStarted","Data":"780d1743ca4417af0a02645f5ff2e32d3362656af5964518f7f3488a6581608c"} Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.057012 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-def2-account-create-update-4n7lx" podStartSLOduration=2.056969226 podStartE2EDuration="2.056969226s" podCreationTimestamp="2025-12-01 10:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:47:38.042462404 +0000 UTC m=+975.276933312" 
watchObservedRunningTime="2025-12-01 10:47:38.056969226 +0000 UTC m=+975.291440124" Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.085815 4909 scope.go:117] "RemoveContainer" containerID="9e2a460a6ca50d1865e83a2fa1f089e0c3eff121aebcbab8268da127cebb3608" Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.138391 4909 scope.go:117] "RemoveContainer" containerID="695a6f1b16e98de141bcf864291d8e68b645eccf3c8886650c68151e2d8d6c60" Dec 01 10:47:38 crc kubenswrapper[4909]: E1201 10:47:38.139105 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"695a6f1b16e98de141bcf864291d8e68b645eccf3c8886650c68151e2d8d6c60\": container with ID starting with 695a6f1b16e98de141bcf864291d8e68b645eccf3c8886650c68151e2d8d6c60 not found: ID does not exist" containerID="695a6f1b16e98de141bcf864291d8e68b645eccf3c8886650c68151e2d8d6c60" Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.139159 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"695a6f1b16e98de141bcf864291d8e68b645eccf3c8886650c68151e2d8d6c60"} err="failed to get container status \"695a6f1b16e98de141bcf864291d8e68b645eccf3c8886650c68151e2d8d6c60\": rpc error: code = NotFound desc = could not find container \"695a6f1b16e98de141bcf864291d8e68b645eccf3c8886650c68151e2d8d6c60\": container with ID starting with 695a6f1b16e98de141bcf864291d8e68b645eccf3c8886650c68151e2d8d6c60 not found: ID does not exist" Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.139203 4909 scope.go:117] "RemoveContainer" containerID="9e2a460a6ca50d1865e83a2fa1f089e0c3eff121aebcbab8268da127cebb3608" Dec 01 10:47:38 crc kubenswrapper[4909]: E1201 10:47:38.139546 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e2a460a6ca50d1865e83a2fa1f089e0c3eff121aebcbab8268da127cebb3608\": container with ID starting with 
9e2a460a6ca50d1865e83a2fa1f089e0c3eff121aebcbab8268da127cebb3608 not found: ID does not exist" containerID="9e2a460a6ca50d1865e83a2fa1f089e0c3eff121aebcbab8268da127cebb3608" Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.139576 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e2a460a6ca50d1865e83a2fa1f089e0c3eff121aebcbab8268da127cebb3608"} err="failed to get container status \"9e2a460a6ca50d1865e83a2fa1f089e0c3eff121aebcbab8268da127cebb3608\": rpc error: code = NotFound desc = could not find container \"9e2a460a6ca50d1865e83a2fa1f089e0c3eff121aebcbab8268da127cebb3608\": container with ID starting with 9e2a460a6ca50d1865e83a2fa1f089e0c3eff121aebcbab8268da127cebb3608 not found: ID does not exist" Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.139773 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-sl7td"] Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.147770 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-sl7td"] Dec 01 10:47:38 crc kubenswrapper[4909]: I1201 10:47:38.165796 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xx8p2"] Dec 01 10:47:38 crc kubenswrapper[4909]: W1201 10:47:38.175888 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e80149e_1959_4ae7_a8b3_41fc91f45121.slice/crio-7e64c47a9d998771c558e84daf7f3560210edfe73bea57635b2f07144d54ea0c WatchSource:0}: Error finding container 7e64c47a9d998771c558e84daf7f3560210edfe73bea57635b2f07144d54ea0c: Status 404 returned error can't find the container with id 7e64c47a9d998771c558e84daf7f3560210edfe73bea57635b2f07144d54ea0c Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.047478 4909 generic.go:334] "Generic (PLEG): container finished" podID="af2c338d-8319-457b-ad3b-cbf51df55d8a" 
containerID="d54eb1d388ec8d4fd2cb7eca3e3eea821fabdbee351d5546657d71b37b528015" exitCode=0 Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.047613 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p56hk" event={"ID":"af2c338d-8319-457b-ad3b-cbf51df55d8a","Type":"ContainerDied","Data":"d54eb1d388ec8d4fd2cb7eca3e3eea821fabdbee351d5546657d71b37b528015"} Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.051045 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xx8p2" event={"ID":"2e80149e-1959-4ae7-a8b3-41fc91f45121","Type":"ContainerStarted","Data":"7e64c47a9d998771c558e84daf7f3560210edfe73bea57635b2f07144d54ea0c"} Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.053284 4909 generic.go:334] "Generic (PLEG): container finished" podID="9f9510b7-d3ab-4508-bf7e-968d2d3b684c" containerID="230179b68886a457a25985c3f4a52bfde6e833e8502b7cd22167b8ecea19df80" exitCode=0 Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.053396 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-def2-account-create-update-4n7lx" event={"ID":"9f9510b7-d3ab-4508-bf7e-968d2d3b684c","Type":"ContainerDied","Data":"230179b68886a457a25985c3f4a52bfde6e833e8502b7cd22167b8ecea19df80"} Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.285487 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2ecb921-0356-4a1d-93dd-1fb91e41e081" path="/var/lib/kubelet/pods/e2ecb921-0356-4a1d-93dd-1fb91e41e081/volumes" Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.461369 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-jt8fv" Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.558446 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jvh6\" (UniqueName: \"kubernetes.io/projected/33db1774-4b90-4cba-be63-744f8b79f29c-kube-api-access-6jvh6\") pod \"33db1774-4b90-4cba-be63-744f8b79f29c\" (UID: \"33db1774-4b90-4cba-be63-744f8b79f29c\") " Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.558784 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33db1774-4b90-4cba-be63-744f8b79f29c-operator-scripts\") pod \"33db1774-4b90-4cba-be63-744f8b79f29c\" (UID: \"33db1774-4b90-4cba-be63-744f8b79f29c\") " Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.559676 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33db1774-4b90-4cba-be63-744f8b79f29c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "33db1774-4b90-4cba-be63-744f8b79f29c" (UID: "33db1774-4b90-4cba-be63-744f8b79f29c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.564546 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33db1774-4b90-4cba-be63-744f8b79f29c-kube-api-access-6jvh6" (OuterVolumeSpecName: "kube-api-access-6jvh6") pod "33db1774-4b90-4cba-be63-744f8b79f29c" (UID: "33db1774-4b90-4cba-be63-744f8b79f29c"). InnerVolumeSpecName "kube-api-access-6jvh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.593755 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4143-account-create-update-9v4tm" Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.599326 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-p56hk" Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.659792 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f21bdf9-c58c-4b57-b3fe-f64338c83e32-operator-scripts\") pod \"8f21bdf9-c58c-4b57-b3fe-f64338c83e32\" (UID: \"8f21bdf9-c58c-4b57-b3fe-f64338c83e32\") " Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.660060 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm6hv\" (UniqueName: \"kubernetes.io/projected/af2c338d-8319-457b-ad3b-cbf51df55d8a-kube-api-access-pm6hv\") pod \"af2c338d-8319-457b-ad3b-cbf51df55d8a\" (UID: \"af2c338d-8319-457b-ad3b-cbf51df55d8a\") " Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.660122 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af2c338d-8319-457b-ad3b-cbf51df55d8a-operator-scripts\") pod \"af2c338d-8319-457b-ad3b-cbf51df55d8a\" (UID: \"af2c338d-8319-457b-ad3b-cbf51df55d8a\") " Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.660214 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m92d6\" (UniqueName: \"kubernetes.io/projected/8f21bdf9-c58c-4b57-b3fe-f64338c83e32-kube-api-access-m92d6\") pod \"8f21bdf9-c58c-4b57-b3fe-f64338c83e32\" (UID: \"8f21bdf9-c58c-4b57-b3fe-f64338c83e32\") " Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.660682 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jvh6\" (UniqueName: \"kubernetes.io/projected/33db1774-4b90-4cba-be63-744f8b79f29c-kube-api-access-6jvh6\") on node \"crc\" DevicePath 
\"\"" Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.660702 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33db1774-4b90-4cba-be63-744f8b79f29c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.660674 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f21bdf9-c58c-4b57-b3fe-f64338c83e32-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8f21bdf9-c58c-4b57-b3fe-f64338c83e32" (UID: "8f21bdf9-c58c-4b57-b3fe-f64338c83e32"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.660814 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af2c338d-8319-457b-ad3b-cbf51df55d8a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "af2c338d-8319-457b-ad3b-cbf51df55d8a" (UID: "af2c338d-8319-457b-ad3b-cbf51df55d8a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.666063 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f21bdf9-c58c-4b57-b3fe-f64338c83e32-kube-api-access-m92d6" (OuterVolumeSpecName: "kube-api-access-m92d6") pod "8f21bdf9-c58c-4b57-b3fe-f64338c83e32" (UID: "8f21bdf9-c58c-4b57-b3fe-f64338c83e32"). InnerVolumeSpecName "kube-api-access-m92d6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.666480 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af2c338d-8319-457b-ad3b-cbf51df55d8a-kube-api-access-pm6hv" (OuterVolumeSpecName: "kube-api-access-pm6hv") pod "af2c338d-8319-457b-ad3b-cbf51df55d8a" (UID: "af2c338d-8319-457b-ad3b-cbf51df55d8a"). 
InnerVolumeSpecName "kube-api-access-pm6hv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.762721 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af2c338d-8319-457b-ad3b-cbf51df55d8a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.765705 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m92d6\" (UniqueName: \"kubernetes.io/projected/8f21bdf9-c58c-4b57-b3fe-f64338c83e32-kube-api-access-m92d6\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.765937 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f21bdf9-c58c-4b57-b3fe-f64338c83e32-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:39 crc kubenswrapper[4909]: I1201 10:47:39.766195 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm6hv\" (UniqueName: \"kubernetes.io/projected/af2c338d-8319-457b-ad3b-cbf51df55d8a-kube-api-access-pm6hv\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:40 crc kubenswrapper[4909]: I1201 10:47:40.067136 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4143-account-create-update-9v4tm" Dec 01 10:47:40 crc kubenswrapper[4909]: I1201 10:47:40.067155 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4143-account-create-update-9v4tm" event={"ID":"8f21bdf9-c58c-4b57-b3fe-f64338c83e32","Type":"ContainerDied","Data":"84145e321db04dc5690b6ca6a203d1a966c5f1442abb510f1c979e532e6a7f1f"} Dec 01 10:47:40 crc kubenswrapper[4909]: I1201 10:47:40.067217 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84145e321db04dc5690b6ca6a203d1a966c5f1442abb510f1c979e532e6a7f1f" Dec 01 10:47:40 crc kubenswrapper[4909]: I1201 10:47:40.069704 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jt8fv" event={"ID":"33db1774-4b90-4cba-be63-744f8b79f29c","Type":"ContainerDied","Data":"fbdea37aed2ab52400a8735d6dc697e1e49883e1193267bf587c10aed8dbf44d"} Dec 01 10:47:40 crc kubenswrapper[4909]: I1201 10:47:40.069735 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbdea37aed2ab52400a8735d6dc697e1e49883e1193267bf587c10aed8dbf44d" Dec 01 10:47:40 crc kubenswrapper[4909]: I1201 10:47:40.069732 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jt8fv" Dec 01 10:47:40 crc kubenswrapper[4909]: I1201 10:47:40.071422 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-p56hk" Dec 01 10:47:40 crc kubenswrapper[4909]: I1201 10:47:40.071411 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p56hk" event={"ID":"af2c338d-8319-457b-ad3b-cbf51df55d8a","Type":"ContainerDied","Data":"780d1743ca4417af0a02645f5ff2e32d3362656af5964518f7f3488a6581608c"} Dec 01 10:47:40 crc kubenswrapper[4909]: I1201 10:47:40.071651 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="780d1743ca4417af0a02645f5ff2e32d3362656af5964518f7f3488a6581608c" Dec 01 10:47:40 crc kubenswrapper[4909]: I1201 10:47:40.341604 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-def2-account-create-update-4n7lx" Dec 01 10:47:40 crc kubenswrapper[4909]: I1201 10:47:40.376640 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppb2r\" (UniqueName: \"kubernetes.io/projected/9f9510b7-d3ab-4508-bf7e-968d2d3b684c-kube-api-access-ppb2r\") pod \"9f9510b7-d3ab-4508-bf7e-968d2d3b684c\" (UID: \"9f9510b7-d3ab-4508-bf7e-968d2d3b684c\") " Dec 01 10:47:40 crc kubenswrapper[4909]: I1201 10:47:40.376826 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f9510b7-d3ab-4508-bf7e-968d2d3b684c-operator-scripts\") pod \"9f9510b7-d3ab-4508-bf7e-968d2d3b684c\" (UID: \"9f9510b7-d3ab-4508-bf7e-968d2d3b684c\") " Dec 01 10:47:40 crc kubenswrapper[4909]: I1201 10:47:40.377641 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f9510b7-d3ab-4508-bf7e-968d2d3b684c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9f9510b7-d3ab-4508-bf7e-968d2d3b684c" (UID: "9f9510b7-d3ab-4508-bf7e-968d2d3b684c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:47:40 crc kubenswrapper[4909]: I1201 10:47:40.381406 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f9510b7-d3ab-4508-bf7e-968d2d3b684c-kube-api-access-ppb2r" (OuterVolumeSpecName: "kube-api-access-ppb2r") pod "9f9510b7-d3ab-4508-bf7e-968d2d3b684c" (UID: "9f9510b7-d3ab-4508-bf7e-968d2d3b684c"). InnerVolumeSpecName "kube-api-access-ppb2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:47:40 crc kubenswrapper[4909]: I1201 10:47:40.478102 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f9510b7-d3ab-4508-bf7e-968d2d3b684c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:40 crc kubenswrapper[4909]: I1201 10:47:40.478138 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppb2r\" (UniqueName: \"kubernetes.io/projected/9f9510b7-d3ab-4508-bf7e-968d2d3b684c-kube-api-access-ppb2r\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:41 crc kubenswrapper[4909]: I1201 10:47:41.082738 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-def2-account-create-update-4n7lx" event={"ID":"9f9510b7-d3ab-4508-bf7e-968d2d3b684c","Type":"ContainerDied","Data":"ba745ab483d85ac5a8cd9a046d68306f9c4c3a82779424184fabe00455bb17ba"} Dec 01 10:47:41 crc kubenswrapper[4909]: I1201 10:47:41.083274 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba745ab483d85ac5a8cd9a046d68306f9c4c3a82779424184fabe00455bb17ba" Dec 01 10:47:41 crc kubenswrapper[4909]: I1201 10:47:41.082804 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-def2-account-create-update-4n7lx" Dec 01 10:47:46 crc kubenswrapper[4909]: I1201 10:47:46.705522 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zcgpf" podUID="a2ab5fcc-f33d-4495-a7e4-c4305f3e846e" containerName="ovn-controller" probeResult="failure" output=< Dec 01 10:47:46 crc kubenswrapper[4909]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 01 10:47:46 crc kubenswrapper[4909]: > Dec 01 10:47:46 crc kubenswrapper[4909]: I1201 10:47:46.763755 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:47:46 crc kubenswrapper[4909]: I1201 10:47:46.770928 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-z64hl" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.017569 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zcgpf-config-cnpww"] Dec 01 10:47:47 crc kubenswrapper[4909]: E1201 10:47:47.018741 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ecb921-0356-4a1d-93dd-1fb91e41e081" containerName="init" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.018769 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ecb921-0356-4a1d-93dd-1fb91e41e081" containerName="init" Dec 01 10:47:47 crc kubenswrapper[4909]: E1201 10:47:47.018799 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af2c338d-8319-457b-ad3b-cbf51df55d8a" containerName="mariadb-database-create" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.018810 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="af2c338d-8319-457b-ad3b-cbf51df55d8a" containerName="mariadb-database-create" Dec 01 10:47:47 crc kubenswrapper[4909]: E1201 10:47:47.018832 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8f21bdf9-c58c-4b57-b3fe-f64338c83e32" containerName="mariadb-account-create-update" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.018842 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f21bdf9-c58c-4b57-b3fe-f64338c83e32" containerName="mariadb-account-create-update" Dec 01 10:47:47 crc kubenswrapper[4909]: E1201 10:47:47.018872 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f9510b7-d3ab-4508-bf7e-968d2d3b684c" containerName="mariadb-account-create-update" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.018897 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9510b7-d3ab-4508-bf7e-968d2d3b684c" containerName="mariadb-account-create-update" Dec 01 10:47:47 crc kubenswrapper[4909]: E1201 10:47:47.018911 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33db1774-4b90-4cba-be63-744f8b79f29c" containerName="mariadb-database-create" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.018919 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="33db1774-4b90-4cba-be63-744f8b79f29c" containerName="mariadb-database-create" Dec 01 10:47:47 crc kubenswrapper[4909]: E1201 10:47:47.018930 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ecb921-0356-4a1d-93dd-1fb91e41e081" containerName="dnsmasq-dns" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.018939 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ecb921-0356-4a1d-93dd-1fb91e41e081" containerName="dnsmasq-dns" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.019229 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2ecb921-0356-4a1d-93dd-1fb91e41e081" containerName="dnsmasq-dns" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.019244 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f9510b7-d3ab-4508-bf7e-968d2d3b684c" containerName="mariadb-account-create-update" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 
10:47:47.019263 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="33db1774-4b90-4cba-be63-744f8b79f29c" containerName="mariadb-database-create" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.019277 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="af2c338d-8319-457b-ad3b-cbf51df55d8a" containerName="mariadb-database-create" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.019361 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f21bdf9-c58c-4b57-b3fe-f64338c83e32" containerName="mariadb-account-create-update" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.020041 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zcgpf-config-cnpww" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.023587 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.037954 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cc86f000-ece0-430b-8f4d-90e9e513c8db-var-log-ovn\") pod \"ovn-controller-zcgpf-config-cnpww\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") " pod="openstack/ovn-controller-zcgpf-config-cnpww" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.038393 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cc86f000-ece0-430b-8f4d-90e9e513c8db-var-run\") pod \"ovn-controller-zcgpf-config-cnpww\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") " pod="openstack/ovn-controller-zcgpf-config-cnpww" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.045356 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/cc86f000-ece0-430b-8f4d-90e9e513c8db-scripts\") pod \"ovn-controller-zcgpf-config-cnpww\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") " pod="openstack/ovn-controller-zcgpf-config-cnpww" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.045500 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpgdv\" (UniqueName: \"kubernetes.io/projected/cc86f000-ece0-430b-8f4d-90e9e513c8db-kube-api-access-kpgdv\") pod \"ovn-controller-zcgpf-config-cnpww\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") " pod="openstack/ovn-controller-zcgpf-config-cnpww" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.045690 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cc86f000-ece0-430b-8f4d-90e9e513c8db-additional-scripts\") pod \"ovn-controller-zcgpf-config-cnpww\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") " pod="openstack/ovn-controller-zcgpf-config-cnpww" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.045798 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cc86f000-ece0-430b-8f4d-90e9e513c8db-var-run-ovn\") pod \"ovn-controller-zcgpf-config-cnpww\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") " pod="openstack/ovn-controller-zcgpf-config-cnpww" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.040648 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zcgpf-config-cnpww"] Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.149057 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cc86f000-ece0-430b-8f4d-90e9e513c8db-var-run\") pod \"ovn-controller-zcgpf-config-cnpww\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") " 
pod="openstack/ovn-controller-zcgpf-config-cnpww" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.149115 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc86f000-ece0-430b-8f4d-90e9e513c8db-scripts\") pod \"ovn-controller-zcgpf-config-cnpww\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") " pod="openstack/ovn-controller-zcgpf-config-cnpww" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.149157 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpgdv\" (UniqueName: \"kubernetes.io/projected/cc86f000-ece0-430b-8f4d-90e9e513c8db-kube-api-access-kpgdv\") pod \"ovn-controller-zcgpf-config-cnpww\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") " pod="openstack/ovn-controller-zcgpf-config-cnpww" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.149208 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cc86f000-ece0-430b-8f4d-90e9e513c8db-additional-scripts\") pod \"ovn-controller-zcgpf-config-cnpww\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") " pod="openstack/ovn-controller-zcgpf-config-cnpww" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.149227 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cc86f000-ece0-430b-8f4d-90e9e513c8db-var-run-ovn\") pod \"ovn-controller-zcgpf-config-cnpww\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") " pod="openstack/ovn-controller-zcgpf-config-cnpww" Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.149252 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cc86f000-ece0-430b-8f4d-90e9e513c8db-var-log-ovn\") pod \"ovn-controller-zcgpf-config-cnpww\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") " 
pod="openstack/ovn-controller-zcgpf-config-cnpww"
Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.149576 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cc86f000-ece0-430b-8f4d-90e9e513c8db-var-log-ovn\") pod \"ovn-controller-zcgpf-config-cnpww\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") " pod="openstack/ovn-controller-zcgpf-config-cnpww"
Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.149667 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cc86f000-ece0-430b-8f4d-90e9e513c8db-var-run\") pod \"ovn-controller-zcgpf-config-cnpww\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") " pod="openstack/ovn-controller-zcgpf-config-cnpww"
Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.151557 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc86f000-ece0-430b-8f4d-90e9e513c8db-scripts\") pod \"ovn-controller-zcgpf-config-cnpww\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") " pod="openstack/ovn-controller-zcgpf-config-cnpww"
Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.152310 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cc86f000-ece0-430b-8f4d-90e9e513c8db-additional-scripts\") pod \"ovn-controller-zcgpf-config-cnpww\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") " pod="openstack/ovn-controller-zcgpf-config-cnpww"
Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.152449 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cc86f000-ece0-430b-8f4d-90e9e513c8db-var-run-ovn\") pod \"ovn-controller-zcgpf-config-cnpww\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") " pod="openstack/ovn-controller-zcgpf-config-cnpww"
Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.157134 4909 generic.go:334] "Generic (PLEG): container finished" podID="fa1d0c2b-1efc-451b-9fe5-58debd89810e" containerID="ff97b98aa46f1e357eecf9003d1f326ad5676fd78ea78233559f9b378f7c8f59" exitCode=0
Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.157205 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fa1d0c2b-1efc-451b-9fe5-58debd89810e","Type":"ContainerDied","Data":"ff97b98aa46f1e357eecf9003d1f326ad5676fd78ea78233559f9b378f7c8f59"}
Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.161594 4909 generic.go:334] "Generic (PLEG): container finished" podID="226ba07f-6dee-4f12-9d0e-4ae327457c2e" containerID="decbb764066fab68413640f5ed91146e03e1ed0ca0234962058e3d16b081d56a" exitCode=0
Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.161820 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"226ba07f-6dee-4f12-9d0e-4ae327457c2e","Type":"ContainerDied","Data":"decbb764066fab68413640f5ed91146e03e1ed0ca0234962058e3d16b081d56a"}
Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.198531 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpgdv\" (UniqueName: \"kubernetes.io/projected/cc86f000-ece0-430b-8f4d-90e9e513c8db-kube-api-access-kpgdv\") pod \"ovn-controller-zcgpf-config-cnpww\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") " pod="openstack/ovn-controller-zcgpf-config-cnpww"
Dec 01 10:47:47 crc kubenswrapper[4909]: I1201 10:47:47.339026 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zcgpf-config-cnpww"
Dec 01 10:47:51 crc kubenswrapper[4909]: I1201 10:47:51.686892 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zcgpf" podUID="a2ab5fcc-f33d-4495-a7e4-c4305f3e846e" containerName="ovn-controller" probeResult="failure" output=<
Dec 01 10:47:51 crc kubenswrapper[4909]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Dec 01 10:47:51 crc kubenswrapper[4909]: >
Dec 01 10:47:52 crc kubenswrapper[4909]: I1201 10:47:52.139415 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zcgpf-config-cnpww"]
Dec 01 10:47:52 crc kubenswrapper[4909]: W1201 10:47:52.152565 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc86f000_ece0_430b_8f4d_90e9e513c8db.slice/crio-3fef2e1dd921860f1913ae4d13a680e2c99d7ac08bee9e742a17807e3c61c0ba WatchSource:0}: Error finding container 3fef2e1dd921860f1913ae4d13a680e2c99d7ac08bee9e742a17807e3c61c0ba: Status 404 returned error can't find the container with id 3fef2e1dd921860f1913ae4d13a680e2c99d7ac08bee9e742a17807e3c61c0ba
Dec 01 10:47:52 crc kubenswrapper[4909]: I1201 10:47:52.211748 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"226ba07f-6dee-4f12-9d0e-4ae327457c2e","Type":"ContainerStarted","Data":"97666d565341b4b350a300b0bc8e32c12b05348bf42108d20b498e93f8aea214"}
Dec 01 10:47:52 crc kubenswrapper[4909]: I1201 10:47:52.212750 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Dec 01 10:47:52 crc kubenswrapper[4909]: I1201 10:47:52.214424 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zcgpf-config-cnpww" event={"ID":"cc86f000-ece0-430b-8f4d-90e9e513c8db","Type":"ContainerStarted","Data":"3fef2e1dd921860f1913ae4d13a680e2c99d7ac08bee9e742a17807e3c61c0ba"}
Dec 01 10:47:52 crc kubenswrapper[4909]: I1201 10:47:52.218048 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fa1d0c2b-1efc-451b-9fe5-58debd89810e","Type":"ContainerStarted","Data":"5d6d7cda93afc9f486e8d753e671c51f23c70cf9e4c5e73c9c6b304d85d4bad2"}
Dec 01 10:47:52 crc kubenswrapper[4909]: I1201 10:47:52.218985 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Dec 01 10:47:52 crc kubenswrapper[4909]: I1201 10:47:52.250969 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=64.272606851 podStartE2EDuration="1m10.250946227s" podCreationTimestamp="2025-12-01 10:46:42 +0000 UTC" firstStartedPulling="2025-12-01 10:47:02.210260462 +0000 UTC m=+939.444731360" lastFinishedPulling="2025-12-01 10:47:08.188599838 +0000 UTC m=+945.423070736" observedRunningTime="2025-12-01 10:47:52.244825918 +0000 UTC m=+989.479296856" watchObservedRunningTime="2025-12-01 10:47:52.250946227 +0000 UTC m=+989.485417115"
Dec 01 10:47:52 crc kubenswrapper[4909]: I1201 10:47:52.284874 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=63.695188024 podStartE2EDuration="1m11.284848008s" podCreationTimestamp="2025-12-01 10:46:41 +0000 UTC" firstStartedPulling="2025-12-01 10:47:00.598912563 +0000 UTC m=+937.833383461" lastFinishedPulling="2025-12-01 10:47:08.188572547 +0000 UTC m=+945.423043445" observedRunningTime="2025-12-01 10:47:52.276512007 +0000 UTC m=+989.510982905" watchObservedRunningTime="2025-12-01 10:47:52.284848008 +0000 UTC m=+989.519318916"
Dec 01 10:47:53 crc kubenswrapper[4909]: I1201 10:47:53.228376 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xx8p2" event={"ID":"2e80149e-1959-4ae7-a8b3-41fc91f45121","Type":"ContainerStarted","Data":"c31a3a73c919ec3abf98051cc3c4a6f5527c01f74209a519b4f5e57054cc8109"}
Dec 01 10:47:53 crc kubenswrapper[4909]: I1201 10:47:53.238261 4909 generic.go:334] "Generic (PLEG): container finished" podID="cc86f000-ece0-430b-8f4d-90e9e513c8db" containerID="ed099a35ee29271cbf59128ca6a1e18c44d89d0d2fc53355c26f026f9c9ddf2f" exitCode=0
Dec 01 10:47:53 crc kubenswrapper[4909]: I1201 10:47:53.239226 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zcgpf-config-cnpww" event={"ID":"cc86f000-ece0-430b-8f4d-90e9e513c8db","Type":"ContainerDied","Data":"ed099a35ee29271cbf59128ca6a1e18c44d89d0d2fc53355c26f026f9c9ddf2f"}
Dec 01 10:47:53 crc kubenswrapper[4909]: I1201 10:47:53.261507 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-xx8p2" podStartSLOduration=2.708444477 podStartE2EDuration="16.26149018s" podCreationTimestamp="2025-12-01 10:47:37 +0000 UTC" firstStartedPulling="2025-12-01 10:47:38.178406669 +0000 UTC m=+975.412877567" lastFinishedPulling="2025-12-01 10:47:51.731452372 +0000 UTC m=+988.965923270" observedRunningTime="2025-12-01 10:47:53.249845712 +0000 UTC m=+990.484316610" watchObservedRunningTime="2025-12-01 10:47:53.26149018 +0000 UTC m=+990.495961078"
Dec 01 10:47:54 crc kubenswrapper[4909]: I1201 10:47:54.573093 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zcgpf-config-cnpww"
Dec 01 10:47:54 crc kubenswrapper[4909]: I1201 10:47:54.702812 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cc86f000-ece0-430b-8f4d-90e9e513c8db-var-run\") pod \"cc86f000-ece0-430b-8f4d-90e9e513c8db\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") "
Dec 01 10:47:54 crc kubenswrapper[4909]: I1201 10:47:54.702931 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cc86f000-ece0-430b-8f4d-90e9e513c8db-var-run-ovn\") pod \"cc86f000-ece0-430b-8f4d-90e9e513c8db\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") "
Dec 01 10:47:54 crc kubenswrapper[4909]: I1201 10:47:54.702967 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc86f000-ece0-430b-8f4d-90e9e513c8db-scripts\") pod \"cc86f000-ece0-430b-8f4d-90e9e513c8db\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") "
Dec 01 10:47:54 crc kubenswrapper[4909]: I1201 10:47:54.702983 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc86f000-ece0-430b-8f4d-90e9e513c8db-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "cc86f000-ece0-430b-8f4d-90e9e513c8db" (UID: "cc86f000-ece0-430b-8f4d-90e9e513c8db"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 10:47:54 crc kubenswrapper[4909]: I1201 10:47:54.703002 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpgdv\" (UniqueName: \"kubernetes.io/projected/cc86f000-ece0-430b-8f4d-90e9e513c8db-kube-api-access-kpgdv\") pod \"cc86f000-ece0-430b-8f4d-90e9e513c8db\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") "
Dec 01 10:47:54 crc kubenswrapper[4909]: I1201 10:47:54.702996 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc86f000-ece0-430b-8f4d-90e9e513c8db-var-run" (OuterVolumeSpecName: "var-run") pod "cc86f000-ece0-430b-8f4d-90e9e513c8db" (UID: "cc86f000-ece0-430b-8f4d-90e9e513c8db"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 10:47:54 crc kubenswrapper[4909]: I1201 10:47:54.703086 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc86f000-ece0-430b-8f4d-90e9e513c8db-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "cc86f000-ece0-430b-8f4d-90e9e513c8db" (UID: "cc86f000-ece0-430b-8f4d-90e9e513c8db"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 10:47:54 crc kubenswrapper[4909]: I1201 10:47:54.703069 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cc86f000-ece0-430b-8f4d-90e9e513c8db-var-log-ovn\") pod \"cc86f000-ece0-430b-8f4d-90e9e513c8db\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") "
Dec 01 10:47:54 crc kubenswrapper[4909]: I1201 10:47:54.703505 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cc86f000-ece0-430b-8f4d-90e9e513c8db-additional-scripts\") pod \"cc86f000-ece0-430b-8f4d-90e9e513c8db\" (UID: \"cc86f000-ece0-430b-8f4d-90e9e513c8db\") "
Dec 01 10:47:54 crc kubenswrapper[4909]: I1201 10:47:54.704188 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc86f000-ece0-430b-8f4d-90e9e513c8db-scripts" (OuterVolumeSpecName: "scripts") pod "cc86f000-ece0-430b-8f4d-90e9e513c8db" (UID: "cc86f000-ece0-430b-8f4d-90e9e513c8db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:47:54 crc kubenswrapper[4909]: I1201 10:47:54.704336 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc86f000-ece0-430b-8f4d-90e9e513c8db-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "cc86f000-ece0-430b-8f4d-90e9e513c8db" (UID: "cc86f000-ece0-430b-8f4d-90e9e513c8db"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:47:54 crc kubenswrapper[4909]: I1201 10:47:54.704500 4909 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cc86f000-ece0-430b-8f4d-90e9e513c8db-var-run-ovn\") on node \"crc\" DevicePath \"\""
Dec 01 10:47:54 crc kubenswrapper[4909]: I1201 10:47:54.704521 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc86f000-ece0-430b-8f4d-90e9e513c8db-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 10:47:54 crc kubenswrapper[4909]: I1201 10:47:54.704530 4909 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cc86f000-ece0-430b-8f4d-90e9e513c8db-var-log-ovn\") on node \"crc\" DevicePath \"\""
Dec 01 10:47:54 crc kubenswrapper[4909]: I1201 10:47:54.704540 4909 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cc86f000-ece0-430b-8f4d-90e9e513c8db-additional-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 10:47:54 crc kubenswrapper[4909]: I1201 10:47:54.704553 4909 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cc86f000-ece0-430b-8f4d-90e9e513c8db-var-run\") on node \"crc\" DevicePath \"\""
Dec 01 10:47:54 crc kubenswrapper[4909]: I1201 10:47:54.709500 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc86f000-ece0-430b-8f4d-90e9e513c8db-kube-api-access-kpgdv" (OuterVolumeSpecName: "kube-api-access-kpgdv") pod "cc86f000-ece0-430b-8f4d-90e9e513c8db" (UID: "cc86f000-ece0-430b-8f4d-90e9e513c8db"). InnerVolumeSpecName "kube-api-access-kpgdv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:47:54 crc kubenswrapper[4909]: I1201 10:47:54.806325 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpgdv\" (UniqueName: \"kubernetes.io/projected/cc86f000-ece0-430b-8f4d-90e9e513c8db-kube-api-access-kpgdv\") on node \"crc\" DevicePath \"\""
Dec 01 10:47:55 crc kubenswrapper[4909]: I1201 10:47:55.253465 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zcgpf-config-cnpww" event={"ID":"cc86f000-ece0-430b-8f4d-90e9e513c8db","Type":"ContainerDied","Data":"3fef2e1dd921860f1913ae4d13a680e2c99d7ac08bee9e742a17807e3c61c0ba"}
Dec 01 10:47:55 crc kubenswrapper[4909]: I1201 10:47:55.253518 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zcgpf-config-cnpww"
Dec 01 10:47:55 crc kubenswrapper[4909]: I1201 10:47:55.253524 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fef2e1dd921860f1913ae4d13a680e2c99d7ac08bee9e742a17807e3c61c0ba"
Dec 01 10:47:55 crc kubenswrapper[4909]: I1201 10:47:55.702471 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-zcgpf-config-cnpww"]
Dec 01 10:47:55 crc kubenswrapper[4909]: I1201 10:47:55.711546 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-zcgpf-config-cnpww"]
Dec 01 10:47:56 crc kubenswrapper[4909]: I1201 10:47:56.691362 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-zcgpf"
Dec 01 10:47:57 crc kubenswrapper[4909]: I1201 10:47:57.266166 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc86f000-ece0-430b-8f4d-90e9e513c8db" path="/var/lib/kubelet/pods/cc86f000-ece0-430b-8f4d-90e9e513c8db/volumes"
Dec 01 10:48:00 crc kubenswrapper[4909]: I1201 10:48:00.298445 4909 generic.go:334] "Generic (PLEG): container finished" podID="2e80149e-1959-4ae7-a8b3-41fc91f45121" containerID="c31a3a73c919ec3abf98051cc3c4a6f5527c01f74209a519b4f5e57054cc8109" exitCode=0
Dec 01 10:48:00 crc kubenswrapper[4909]: I1201 10:48:00.298535 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xx8p2" event={"ID":"2e80149e-1959-4ae7-a8b3-41fc91f45121","Type":"ContainerDied","Data":"c31a3a73c919ec3abf98051cc3c4a6f5527c01f74209a519b4f5e57054cc8109"}
Dec 01 10:48:01 crc kubenswrapper[4909]: I1201 10:48:01.694751 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xx8p2"
Dec 01 10:48:01 crc kubenswrapper[4909]: I1201 10:48:01.827523 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2e80149e-1959-4ae7-a8b3-41fc91f45121-db-sync-config-data\") pod \"2e80149e-1959-4ae7-a8b3-41fc91f45121\" (UID: \"2e80149e-1959-4ae7-a8b3-41fc91f45121\") "
Dec 01 10:48:01 crc kubenswrapper[4909]: I1201 10:48:01.827928 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e80149e-1959-4ae7-a8b3-41fc91f45121-combined-ca-bundle\") pod \"2e80149e-1959-4ae7-a8b3-41fc91f45121\" (UID: \"2e80149e-1959-4ae7-a8b3-41fc91f45121\") "
Dec 01 10:48:01 crc kubenswrapper[4909]: I1201 10:48:01.828265 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e80149e-1959-4ae7-a8b3-41fc91f45121-config-data\") pod \"2e80149e-1959-4ae7-a8b3-41fc91f45121\" (UID: \"2e80149e-1959-4ae7-a8b3-41fc91f45121\") "
Dec 01 10:48:01 crc kubenswrapper[4909]: I1201 10:48:01.828424 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz5rr\" (UniqueName: \"kubernetes.io/projected/2e80149e-1959-4ae7-a8b3-41fc91f45121-kube-api-access-vz5rr\") pod \"2e80149e-1959-4ae7-a8b3-41fc91f45121\" (UID: \"2e80149e-1959-4ae7-a8b3-41fc91f45121\") "
Dec 01 10:48:01 crc kubenswrapper[4909]: I1201 10:48:01.833199 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e80149e-1959-4ae7-a8b3-41fc91f45121-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2e80149e-1959-4ae7-a8b3-41fc91f45121" (UID: "2e80149e-1959-4ae7-a8b3-41fc91f45121"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:48:01 crc kubenswrapper[4909]: I1201 10:48:01.837230 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e80149e-1959-4ae7-a8b3-41fc91f45121-kube-api-access-vz5rr" (OuterVolumeSpecName: "kube-api-access-vz5rr") pod "2e80149e-1959-4ae7-a8b3-41fc91f45121" (UID: "2e80149e-1959-4ae7-a8b3-41fc91f45121"). InnerVolumeSpecName "kube-api-access-vz5rr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:48:01 crc kubenswrapper[4909]: I1201 10:48:01.852221 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e80149e-1959-4ae7-a8b3-41fc91f45121-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e80149e-1959-4ae7-a8b3-41fc91f45121" (UID: "2e80149e-1959-4ae7-a8b3-41fc91f45121"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:48:01 crc kubenswrapper[4909]: I1201 10:48:01.872084 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e80149e-1959-4ae7-a8b3-41fc91f45121-config-data" (OuterVolumeSpecName: "config-data") pod "2e80149e-1959-4ae7-a8b3-41fc91f45121" (UID: "2e80149e-1959-4ae7-a8b3-41fc91f45121"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:48:01 crc kubenswrapper[4909]: I1201 10:48:01.930334 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz5rr\" (UniqueName: \"kubernetes.io/projected/2e80149e-1959-4ae7-a8b3-41fc91f45121-kube-api-access-vz5rr\") on node \"crc\" DevicePath \"\""
Dec 01 10:48:01 crc kubenswrapper[4909]: I1201 10:48:01.930379 4909 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2e80149e-1959-4ae7-a8b3-41fc91f45121-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 10:48:01 crc kubenswrapper[4909]: I1201 10:48:01.930391 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e80149e-1959-4ae7-a8b3-41fc91f45121-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 10:48:01 crc kubenswrapper[4909]: I1201 10:48:01.930401 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e80149e-1959-4ae7-a8b3-41fc91f45121-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 10:48:02 crc kubenswrapper[4909]: I1201 10:48:02.315768 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xx8p2" event={"ID":"2e80149e-1959-4ae7-a8b3-41fc91f45121","Type":"ContainerDied","Data":"7e64c47a9d998771c558e84daf7f3560210edfe73bea57635b2f07144d54ea0c"}
Dec 01 10:48:02 crc kubenswrapper[4909]: I1201 10:48:02.316130 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e64c47a9d998771c558e84daf7f3560210edfe73bea57635b2f07144d54ea0c"
Dec 01 10:48:02 crc kubenswrapper[4909]: I1201 10:48:02.315839 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xx8p2"
Dec 01 10:48:02 crc kubenswrapper[4909]: I1201 10:48:02.822632 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"]
Dec 01 10:48:02 crc kubenswrapper[4909]: E1201 10:48:02.823307 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e80149e-1959-4ae7-a8b3-41fc91f45121" containerName="glance-db-sync"
Dec 01 10:48:02 crc kubenswrapper[4909]: I1201 10:48:02.823323 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e80149e-1959-4ae7-a8b3-41fc91f45121" containerName="glance-db-sync"
Dec 01 10:48:02 crc kubenswrapper[4909]: E1201 10:48:02.823342 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc86f000-ece0-430b-8f4d-90e9e513c8db" containerName="ovn-config"
Dec 01 10:48:02 crc kubenswrapper[4909]: I1201 10:48:02.823351 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc86f000-ece0-430b-8f4d-90e9e513c8db" containerName="ovn-config"
Dec 01 10:48:02 crc kubenswrapper[4909]: I1201 10:48:02.823685 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e80149e-1959-4ae7-a8b3-41fc91f45121" containerName="glance-db-sync"
Dec 01 10:48:02 crc kubenswrapper[4909]: I1201 10:48:02.823732 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc86f000-ece0-430b-8f4d-90e9e513c8db" containerName="ovn-config"
Dec 01 10:48:02 crc kubenswrapper[4909]: I1201 10:48:02.829659 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"
Dec 01 10:48:02 crc kubenswrapper[4909]: I1201 10:48:02.858837 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"]
Dec 01 10:48:02 crc kubenswrapper[4909]: I1201 10:48:02.947292 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-xgjwv\" (UID: \"77987e39-1fa6-44c9-9f33-c066775eae5a\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"
Dec 01 10:48:02 crc kubenswrapper[4909]: I1201 10:48:02.947398 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-config\") pod \"dnsmasq-dns-54f9b7b8d9-xgjwv\" (UID: \"77987e39-1fa6-44c9-9f33-c066775eae5a\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"
Dec 01 10:48:02 crc kubenswrapper[4909]: I1201 10:48:02.947439 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgkql\" (UniqueName: \"kubernetes.io/projected/77987e39-1fa6-44c9-9f33-c066775eae5a-kube-api-access-xgkql\") pod \"dnsmasq-dns-54f9b7b8d9-xgjwv\" (UID: \"77987e39-1fa6-44c9-9f33-c066775eae5a\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"
Dec 01 10:48:02 crc kubenswrapper[4909]: I1201 10:48:02.947486 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-xgjwv\" (UID: \"77987e39-1fa6-44c9-9f33-c066775eae5a\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"
Dec 01 10:48:02 crc kubenswrapper[4909]: I1201 10:48:02.947561 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-xgjwv\" (UID: \"77987e39-1fa6-44c9-9f33-c066775eae5a\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.049866 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-xgjwv\" (UID: \"77987e39-1fa6-44c9-9f33-c066775eae5a\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.049982 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-config\") pod \"dnsmasq-dns-54f9b7b8d9-xgjwv\" (UID: \"77987e39-1fa6-44c9-9f33-c066775eae5a\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.050068 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgkql\" (UniqueName: \"kubernetes.io/projected/77987e39-1fa6-44c9-9f33-c066775eae5a-kube-api-access-xgkql\") pod \"dnsmasq-dns-54f9b7b8d9-xgjwv\" (UID: \"77987e39-1fa6-44c9-9f33-c066775eae5a\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.050121 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-xgjwv\" (UID: \"77987e39-1fa6-44c9-9f33-c066775eae5a\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.050189 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-xgjwv\" (UID: \"77987e39-1fa6-44c9-9f33-c066775eae5a\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.051319 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-xgjwv\" (UID: \"77987e39-1fa6-44c9-9f33-c066775eae5a\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.051321 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-config\") pod \"dnsmasq-dns-54f9b7b8d9-xgjwv\" (UID: \"77987e39-1fa6-44c9-9f33-c066775eae5a\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.051508 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-xgjwv\" (UID: \"77987e39-1fa6-44c9-9f33-c066775eae5a\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.052014 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-xgjwv\" (UID: \"77987e39-1fa6-44c9-9f33-c066775eae5a\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.067076 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgkql\" (UniqueName: \"kubernetes.io/projected/77987e39-1fa6-44c9-9f33-c066775eae5a-kube-api-access-xgkql\") pod \"dnsmasq-dns-54f9b7b8d9-xgjwv\" (UID: \"77987e39-1fa6-44c9-9f33-c066775eae5a\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.133706 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.155243 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.536737 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.609936 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-tb8bv"]
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.611180 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-tb8bv"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.640129 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-tb8bv"]
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.688998 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-8r6cb"]
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.691382 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8r6cb"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.730512 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-891a-account-create-update-dpwsn"]
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.731771 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-891a-account-create-update-dpwsn"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.748717 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.752430 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-891a-account-create-update-dpwsn"]
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.764957 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-8r6cb"]
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.766424 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76721a6a-527e-4c51-8fbb-5a63dfb515a0-operator-scripts\") pod \"cinder-db-create-tb8bv\" (UID: \"76721a6a-527e-4c51-8fbb-5a63dfb515a0\") " pod="openstack/cinder-db-create-tb8bv"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.766498 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9vq9\" (UniqueName: \"kubernetes.io/projected/76721a6a-527e-4c51-8fbb-5a63dfb515a0-kube-api-access-l9vq9\") pod \"cinder-db-create-tb8bv\" (UID: \"76721a6a-527e-4c51-8fbb-5a63dfb515a0\") " pod="openstack/cinder-db-create-tb8bv"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.870343 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76721a6a-527e-4c51-8fbb-5a63dfb515a0-operator-scripts\") pod \"cinder-db-create-tb8bv\" (UID: \"76721a6a-527e-4c51-8fbb-5a63dfb515a0\") " pod="openstack/cinder-db-create-tb8bv"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.870720 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e50ed289-5809-492b-a95a-895da6ba0c76-operator-scripts\") pod \"cinder-891a-account-create-update-dpwsn\" (UID: \"e50ed289-5809-492b-a95a-895da6ba0c76\") " pod="openstack/cinder-891a-account-create-update-dpwsn"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.870825 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp6fz\" (UniqueName: \"kubernetes.io/projected/96ffe604-13b8-4b0e-ba75-ed64c19a9b8e-kube-api-access-hp6fz\") pod \"barbican-db-create-8r6cb\" (UID: \"96ffe604-13b8-4b0e-ba75-ed64c19a9b8e\") " pod="openstack/barbican-db-create-8r6cb"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.870864 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9vq9\" (UniqueName: \"kubernetes.io/projected/76721a6a-527e-4c51-8fbb-5a63dfb515a0-kube-api-access-l9vq9\") pod \"cinder-db-create-tb8bv\" (UID: \"76721a6a-527e-4c51-8fbb-5a63dfb515a0\") " pod="openstack/cinder-db-create-tb8bv"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.870992 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvk8g\" (UniqueName: \"kubernetes.io/projected/e50ed289-5809-492b-a95a-895da6ba0c76-kube-api-access-jvk8g\") pod \"cinder-891a-account-create-update-dpwsn\" (UID: \"e50ed289-5809-492b-a95a-895da6ba0c76\") " pod="openstack/cinder-891a-account-create-update-dpwsn"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.871023 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96ffe604-13b8-4b0e-ba75-ed64c19a9b8e-operator-scripts\") pod \"barbican-db-create-8r6cb\" (UID: \"96ffe604-13b8-4b0e-ba75-ed64c19a9b8e\") " pod="openstack/barbican-db-create-8r6cb"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.871618 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76721a6a-527e-4c51-8fbb-5a63dfb515a0-operator-scripts\") pod \"cinder-db-create-tb8bv\" (UID: \"76721a6a-527e-4c51-8fbb-5a63dfb515a0\") " pod="openstack/cinder-db-create-tb8bv"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.890438 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-z4t9q"]
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.891899 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-z4t9q"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.900379 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-z4t9q"]
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.910728 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9vq9\" (UniqueName: \"kubernetes.io/projected/76721a6a-527e-4c51-8fbb-5a63dfb515a0-kube-api-access-l9vq9\") pod \"cinder-db-create-tb8bv\" (UID: \"76721a6a-527e-4c51-8fbb-5a63dfb515a0\") " pod="openstack/cinder-db-create-tb8bv"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.943160 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-tb8bv"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.967445 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-ffhtn"]
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.969038 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ffhtn"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.973473 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fpd6f"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.973836 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.974026 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.974182 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.979264 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j8wd\" (UniqueName: \"kubernetes.io/projected/0182ce0e-1710-4fdb-b7cd-6143b66d6c0b-kube-api-access-7j8wd\") pod \"keystone-db-sync-ffhtn\" (UID: \"0182ce0e-1710-4fdb-b7cd-6143b66d6c0b\") " pod="openstack/keystone-db-sync-ffhtn"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.979314 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e50ed289-5809-492b-a95a-895da6ba0c76-operator-scripts\") pod \"cinder-891a-account-create-update-dpwsn\" (UID: \"e50ed289-5809-492b-a95a-895da6ba0c76\") " pod="openstack/cinder-891a-account-create-update-dpwsn"
Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.979339 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtrhm\" (UniqueName: \"kubernetes.io/projected/5ae5889f-72cd-4333-9095-93bf13bcdc14-kube-api-access-vtrhm\") pod \"neutron-db-create-z4t9q\" (UID: \"5ae5889f-72cd-4333-9095-93bf13bcdc14\") " pod="openstack/neutron-db-create-z4t9q"
Dec 01 10:48:03 crc kubenswrapper[4909]:
I1201 10:48:03.979373 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0182ce0e-1710-4fdb-b7cd-6143b66d6c0b-config-data\") pod \"keystone-db-sync-ffhtn\" (UID: \"0182ce0e-1710-4fdb-b7cd-6143b66d6c0b\") " pod="openstack/keystone-db-sync-ffhtn" Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.979393 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0182ce0e-1710-4fdb-b7cd-6143b66d6c0b-combined-ca-bundle\") pod \"keystone-db-sync-ffhtn\" (UID: \"0182ce0e-1710-4fdb-b7cd-6143b66d6c0b\") " pod="openstack/keystone-db-sync-ffhtn" Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.979437 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp6fz\" (UniqueName: \"kubernetes.io/projected/96ffe604-13b8-4b0e-ba75-ed64c19a9b8e-kube-api-access-hp6fz\") pod \"barbican-db-create-8r6cb\" (UID: \"96ffe604-13b8-4b0e-ba75-ed64c19a9b8e\") " pod="openstack/barbican-db-create-8r6cb" Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.979481 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ae5889f-72cd-4333-9095-93bf13bcdc14-operator-scripts\") pod \"neutron-db-create-z4t9q\" (UID: \"5ae5889f-72cd-4333-9095-93bf13bcdc14\") " pod="openstack/neutron-db-create-z4t9q" Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.979520 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvk8g\" (UniqueName: \"kubernetes.io/projected/e50ed289-5809-492b-a95a-895da6ba0c76-kube-api-access-jvk8g\") pod \"cinder-891a-account-create-update-dpwsn\" (UID: \"e50ed289-5809-492b-a95a-895da6ba0c76\") " pod="openstack/cinder-891a-account-create-update-dpwsn" Dec 01 10:48:03 crc 
kubenswrapper[4909]: I1201 10:48:03.979540 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96ffe604-13b8-4b0e-ba75-ed64c19a9b8e-operator-scripts\") pod \"barbican-db-create-8r6cb\" (UID: \"96ffe604-13b8-4b0e-ba75-ed64c19a9b8e\") " pod="openstack/barbican-db-create-8r6cb" Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.980182 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96ffe604-13b8-4b0e-ba75-ed64c19a9b8e-operator-scripts\") pod \"barbican-db-create-8r6cb\" (UID: \"96ffe604-13b8-4b0e-ba75-ed64c19a9b8e\") " pod="openstack/barbican-db-create-8r6cb" Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.980808 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e50ed289-5809-492b-a95a-895da6ba0c76-operator-scripts\") pod \"cinder-891a-account-create-update-dpwsn\" (UID: \"e50ed289-5809-492b-a95a-895da6ba0c76\") " pod="openstack/cinder-891a-account-create-update-dpwsn" Dec 01 10:48:03 crc kubenswrapper[4909]: I1201 10:48:03.999389 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ffhtn"] Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.007293 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp6fz\" (UniqueName: \"kubernetes.io/projected/96ffe604-13b8-4b0e-ba75-ed64c19a9b8e-kube-api-access-hp6fz\") pod \"barbican-db-create-8r6cb\" (UID: \"96ffe604-13b8-4b0e-ba75-ed64c19a9b8e\") " pod="openstack/barbican-db-create-8r6cb" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.011983 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvk8g\" (UniqueName: \"kubernetes.io/projected/e50ed289-5809-492b-a95a-895da6ba0c76-kube-api-access-jvk8g\") pod 
\"cinder-891a-account-create-update-dpwsn\" (UID: \"e50ed289-5809-492b-a95a-895da6ba0c76\") " pod="openstack/cinder-891a-account-create-update-dpwsn" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.019503 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-3f20-account-create-update-xfjjq"] Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.020677 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3f20-account-create-update-xfjjq" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.024310 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.039932 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8r6cb" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.050560 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3f20-account-create-update-xfjjq"] Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.065439 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-891a-account-create-update-dpwsn" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.086541 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j8wd\" (UniqueName: \"kubernetes.io/projected/0182ce0e-1710-4fdb-b7cd-6143b66d6c0b-kube-api-access-7j8wd\") pod \"keystone-db-sync-ffhtn\" (UID: \"0182ce0e-1710-4fdb-b7cd-6143b66d6c0b\") " pod="openstack/keystone-db-sync-ffhtn" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.086590 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtrhm\" (UniqueName: \"kubernetes.io/projected/5ae5889f-72cd-4333-9095-93bf13bcdc14-kube-api-access-vtrhm\") pod \"neutron-db-create-z4t9q\" (UID: \"5ae5889f-72cd-4333-9095-93bf13bcdc14\") " pod="openstack/neutron-db-create-z4t9q" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.086619 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvm4h\" (UniqueName: \"kubernetes.io/projected/5d4cdf99-e33d-4314-ba4d-3cad05e58712-kube-api-access-mvm4h\") pod \"barbican-3f20-account-create-update-xfjjq\" (UID: \"5d4cdf99-e33d-4314-ba4d-3cad05e58712\") " pod="openstack/barbican-3f20-account-create-update-xfjjq" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.086643 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0182ce0e-1710-4fdb-b7cd-6143b66d6c0b-config-data\") pod \"keystone-db-sync-ffhtn\" (UID: \"0182ce0e-1710-4fdb-b7cd-6143b66d6c0b\") " pod="openstack/keystone-db-sync-ffhtn" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.086660 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0182ce0e-1710-4fdb-b7cd-6143b66d6c0b-combined-ca-bundle\") pod \"keystone-db-sync-ffhtn\" (UID: 
\"0182ce0e-1710-4fdb-b7cd-6143b66d6c0b\") " pod="openstack/keystone-db-sync-ffhtn" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.086694 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d4cdf99-e33d-4314-ba4d-3cad05e58712-operator-scripts\") pod \"barbican-3f20-account-create-update-xfjjq\" (UID: \"5d4cdf99-e33d-4314-ba4d-3cad05e58712\") " pod="openstack/barbican-3f20-account-create-update-xfjjq" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.086740 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ae5889f-72cd-4333-9095-93bf13bcdc14-operator-scripts\") pod \"neutron-db-create-z4t9q\" (UID: \"5ae5889f-72cd-4333-9095-93bf13bcdc14\") " pod="openstack/neutron-db-create-z4t9q" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.090398 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"] Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.090838 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ae5889f-72cd-4333-9095-93bf13bcdc14-operator-scripts\") pod \"neutron-db-create-z4t9q\" (UID: \"5ae5889f-72cd-4333-9095-93bf13bcdc14\") " pod="openstack/neutron-db-create-z4t9q" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.111317 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0182ce0e-1710-4fdb-b7cd-6143b66d6c0b-combined-ca-bundle\") pod \"keystone-db-sync-ffhtn\" (UID: \"0182ce0e-1710-4fdb-b7cd-6143b66d6c0b\") " pod="openstack/keystone-db-sync-ffhtn" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.112798 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0182ce0e-1710-4fdb-b7cd-6143b66d6c0b-config-data\") pod \"keystone-db-sync-ffhtn\" (UID: \"0182ce0e-1710-4fdb-b7cd-6143b66d6c0b\") " pod="openstack/keystone-db-sync-ffhtn" Dec 01 10:48:04 crc kubenswrapper[4909]: W1201 10:48:04.126618 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77987e39_1fa6_44c9_9f33_c066775eae5a.slice/crio-290efa853a44a1527bcb1f03fa3c2196e07e1855ec657e9b61a426ada3af305b WatchSource:0}: Error finding container 290efa853a44a1527bcb1f03fa3c2196e07e1855ec657e9b61a426ada3af305b: Status 404 returned error can't find the container with id 290efa853a44a1527bcb1f03fa3c2196e07e1855ec657e9b61a426ada3af305b Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.127470 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j8wd\" (UniqueName: \"kubernetes.io/projected/0182ce0e-1710-4fdb-b7cd-6143b66d6c0b-kube-api-access-7j8wd\") pod \"keystone-db-sync-ffhtn\" (UID: \"0182ce0e-1710-4fdb-b7cd-6143b66d6c0b\") " pod="openstack/keystone-db-sync-ffhtn" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.144251 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtrhm\" (UniqueName: \"kubernetes.io/projected/5ae5889f-72cd-4333-9095-93bf13bcdc14-kube-api-access-vtrhm\") pod \"neutron-db-create-z4t9q\" (UID: \"5ae5889f-72cd-4333-9095-93bf13bcdc14\") " pod="openstack/neutron-db-create-z4t9q" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.188364 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d4cdf99-e33d-4314-ba4d-3cad05e58712-operator-scripts\") pod \"barbican-3f20-account-create-update-xfjjq\" (UID: \"5d4cdf99-e33d-4314-ba4d-3cad05e58712\") " pod="openstack/barbican-3f20-account-create-update-xfjjq" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.188524 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvm4h\" (UniqueName: \"kubernetes.io/projected/5d4cdf99-e33d-4314-ba4d-3cad05e58712-kube-api-access-mvm4h\") pod \"barbican-3f20-account-create-update-xfjjq\" (UID: \"5d4cdf99-e33d-4314-ba4d-3cad05e58712\") " pod="openstack/barbican-3f20-account-create-update-xfjjq" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.190251 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d4cdf99-e33d-4314-ba4d-3cad05e58712-operator-scripts\") pod \"barbican-3f20-account-create-update-xfjjq\" (UID: \"5d4cdf99-e33d-4314-ba4d-3cad05e58712\") " pod="openstack/barbican-3f20-account-create-update-xfjjq" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.199618 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-cbeb-account-create-update-8lttg"] Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.227716 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cbeb-account-create-update-8lttg"] Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.227829 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cbeb-account-create-update-8lttg" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.232647 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.239007 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvm4h\" (UniqueName: \"kubernetes.io/projected/5d4cdf99-e33d-4314-ba4d-3cad05e58712-kube-api-access-mvm4h\") pod \"barbican-3f20-account-create-update-xfjjq\" (UID: \"5d4cdf99-e33d-4314-ba4d-3cad05e58712\") " pod="openstack/barbican-3f20-account-create-update-xfjjq" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.242089 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-z4t9q" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.358737 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ffhtn" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.364174 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-3f20-account-create-update-xfjjq" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.379143 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv" event={"ID":"77987e39-1fa6-44c9-9f33-c066775eae5a","Type":"ContainerStarted","Data":"290efa853a44a1527bcb1f03fa3c2196e07e1855ec657e9b61a426ada3af305b"} Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.396520 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36-operator-scripts\") pod \"neutron-cbeb-account-create-update-8lttg\" (UID: \"9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36\") " pod="openstack/neutron-cbeb-account-create-update-8lttg" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.396616 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mjnw\" (UniqueName: \"kubernetes.io/projected/9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36-kube-api-access-6mjnw\") pod \"neutron-cbeb-account-create-update-8lttg\" (UID: \"9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36\") " pod="openstack/neutron-cbeb-account-create-update-8lttg" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.497653 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36-operator-scripts\") pod \"neutron-cbeb-account-create-update-8lttg\" (UID: \"9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36\") " pod="openstack/neutron-cbeb-account-create-update-8lttg" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.498218 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mjnw\" (UniqueName: \"kubernetes.io/projected/9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36-kube-api-access-6mjnw\") pod 
\"neutron-cbeb-account-create-update-8lttg\" (UID: \"9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36\") " pod="openstack/neutron-cbeb-account-create-update-8lttg" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.499125 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36-operator-scripts\") pod \"neutron-cbeb-account-create-update-8lttg\" (UID: \"9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36\") " pod="openstack/neutron-cbeb-account-create-update-8lttg" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.523987 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mjnw\" (UniqueName: \"kubernetes.io/projected/9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36-kube-api-access-6mjnw\") pod \"neutron-cbeb-account-create-update-8lttg\" (UID: \"9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36\") " pod="openstack/neutron-cbeb-account-create-update-8lttg" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.612406 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cbeb-account-create-update-8lttg" Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.733023 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-tb8bv"] Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.833967 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-891a-account-create-update-dpwsn"] Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.944926 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-z4t9q"] Dec 01 10:48:04 crc kubenswrapper[4909]: W1201 10:48:04.957835 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ae5889f_72cd_4333_9095_93bf13bcdc14.slice/crio-7998faa67c141dcdbd0c2446fc15b3173400b156e9fd6fc0ec8c0996373f4c34 WatchSource:0}: Error finding container 7998faa67c141dcdbd0c2446fc15b3173400b156e9fd6fc0ec8c0996373f4c34: Status 404 returned error can't find the container with id 7998faa67c141dcdbd0c2446fc15b3173400b156e9fd6fc0ec8c0996373f4c34 Dec 01 10:48:04 crc kubenswrapper[4909]: I1201 10:48:04.962323 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-8r6cb"] Dec 01 10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.004692 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cbeb-account-create-update-8lttg"] Dec 01 10:48:05 crc kubenswrapper[4909]: W1201 10:48:05.014748 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a51be9c_9fb0_4532_8b3d_1f3d1adc7d36.slice/crio-2b31123250b21141cc9595093614be93b4dc293e1791f9f58393878ee1d0e978 WatchSource:0}: Error finding container 2b31123250b21141cc9595093614be93b4dc293e1791f9f58393878ee1d0e978: Status 404 returned error can't find the container with id 2b31123250b21141cc9595093614be93b4dc293e1791f9f58393878ee1d0e978 Dec 01 
10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.072463 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3f20-account-create-update-xfjjq"] Dec 01 10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.109478 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ffhtn"] Dec 01 10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.394722 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-z4t9q" event={"ID":"5ae5889f-72cd-4333-9095-93bf13bcdc14","Type":"ContainerStarted","Data":"e02bfc85d62a759463f53a75b25bb74cfaf679f7e4b51d3ee20476be4578288d"} Dec 01 10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.395331 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-z4t9q" event={"ID":"5ae5889f-72cd-4333-9095-93bf13bcdc14","Type":"ContainerStarted","Data":"7998faa67c141dcdbd0c2446fc15b3173400b156e9fd6fc0ec8c0996373f4c34"} Dec 01 10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.400829 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-891a-account-create-update-dpwsn" event={"ID":"e50ed289-5809-492b-a95a-895da6ba0c76","Type":"ContainerStarted","Data":"1eb1437d42668fff56e46dc63ac5a12147734fde7aacbb674871a966e6b5541c"} Dec 01 10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.400907 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-891a-account-create-update-dpwsn" event={"ID":"e50ed289-5809-492b-a95a-895da6ba0c76","Type":"ContainerStarted","Data":"669b60b1806b9b5550913e6c8e2567fc8ab73e4420cd9623575d569cfe117063"} Dec 01 10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.403151 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cbeb-account-create-update-8lttg" event={"ID":"9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36","Type":"ContainerStarted","Data":"2c9a8a3316e52561aa182934f04e3406c535bac7962bc6215d33f1aae52b2842"} Dec 01 10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.403189 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cbeb-account-create-update-8lttg" event={"ID":"9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36","Type":"ContainerStarted","Data":"2b31123250b21141cc9595093614be93b4dc293e1791f9f58393878ee1d0e978"} Dec 01 10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.406424 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8r6cb" event={"ID":"96ffe604-13b8-4b0e-ba75-ed64c19a9b8e","Type":"ContainerStarted","Data":"a9fbadf42bb43c54520be45792690a305f564b69fbace35a4cc7701ebc44182f"} Dec 01 10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.406455 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8r6cb" event={"ID":"96ffe604-13b8-4b0e-ba75-ed64c19a9b8e","Type":"ContainerStarted","Data":"e14e0b6cc054487219e0f62093320049247c78307c2c23efee1cb9d4a6ee2788"} Dec 01 10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.414937 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-z4t9q" podStartSLOduration=2.41490878 podStartE2EDuration="2.41490878s" podCreationTimestamp="2025-12-01 10:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:48:05.40969279 +0000 UTC m=+1002.644163708" watchObservedRunningTime="2025-12-01 10:48:05.41490878 +0000 UTC m=+1002.649379678" Dec 01 10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.415724 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv" event={"ID":"77987e39-1fa6-44c9-9f33-c066775eae5a","Type":"ContainerStarted","Data":"9a72c4e1c1c17f9dd6950b27c0afc4b821b7d48906714405808f12e577ae636d"} Dec 01 10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.425980 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3f20-account-create-update-xfjjq" 
event={"ID":"5d4cdf99-e33d-4314-ba4d-3cad05e58712","Type":"ContainerStarted","Data":"d84d45e370457c4913ada2e3cffde5a5fc7aa63070a81d6561afb65c04df13a6"} Dec 01 10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.426034 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3f20-account-create-update-xfjjq" event={"ID":"5d4cdf99-e33d-4314-ba4d-3cad05e58712","Type":"ContainerStarted","Data":"e9babaa2089f715de7e574d1ab563c8eb40f2a5da321ce8c8e3560ea50f04b47"} Dec 01 10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.427911 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ffhtn" event={"ID":"0182ce0e-1710-4fdb-b7cd-6143b66d6c0b","Type":"ContainerStarted","Data":"9663fd8a78cf37e984eb1201d7b76491bedaa8c976fcadd6f85b8272b9bf939b"} Dec 01 10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.431863 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-cbeb-account-create-update-8lttg" podStartSLOduration=1.431841589 podStartE2EDuration="1.431841589s" podCreationTimestamp="2025-12-01 10:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:48:05.430301889 +0000 UTC m=+1002.664772807" watchObservedRunningTime="2025-12-01 10:48:05.431841589 +0000 UTC m=+1002.666312487" Dec 01 10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.433894 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tb8bv" event={"ID":"76721a6a-527e-4c51-8fbb-5a63dfb515a0","Type":"ContainerStarted","Data":"ba3e371b99cd19fd004a535768c38eb3df40f1a5f5e3a546cdeab6fcfbebe989"} Dec 01 10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.433952 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tb8bv" event={"ID":"76721a6a-527e-4c51-8fbb-5a63dfb515a0","Type":"ContainerStarted","Data":"046aaebcc9367ab0ae26388635242f91840924cac702f698c3ecc9b59e710c37"} Dec 01 
10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.477112 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-891a-account-create-update-dpwsn" podStartSLOduration=2.477082769 podStartE2EDuration="2.477082769s" podCreationTimestamp="2025-12-01 10:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:48:05.453033868 +0000 UTC m=+1002.687504766" watchObservedRunningTime="2025-12-01 10:48:05.477082769 +0000 UTC m=+1002.711553667" Dec 01 10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.492692 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-8r6cb" podStartSLOduration=2.492668435 podStartE2EDuration="2.492668435s" podCreationTimestamp="2025-12-01 10:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:48:05.48112764 +0000 UTC m=+1002.715598558" watchObservedRunningTime="2025-12-01 10:48:05.492668435 +0000 UTC m=+1002.727139333" Dec 01 10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.512327 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-tb8bv" podStartSLOduration=2.512300803 podStartE2EDuration="2.512300803s" podCreationTimestamp="2025-12-01 10:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:48:05.497835083 +0000 UTC m=+1002.732305981" watchObservedRunningTime="2025-12-01 10:48:05.512300803 +0000 UTC m=+1002.746771711" Dec 01 10:48:05 crc kubenswrapper[4909]: I1201 10:48:05.525588 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-3f20-account-create-update-xfjjq" podStartSLOduration=2.525565974 podStartE2EDuration="2.525565974s" podCreationTimestamp="2025-12-01 
10:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:48:05.521494811 +0000 UTC m=+1002.755965709" watchObservedRunningTime="2025-12-01 10:48:05.525565974 +0000 UTC m=+1002.760036872" Dec 01 10:48:06 crc kubenswrapper[4909]: I1201 10:48:06.456736 4909 generic.go:334] "Generic (PLEG): container finished" podID="76721a6a-527e-4c51-8fbb-5a63dfb515a0" containerID="ba3e371b99cd19fd004a535768c38eb3df40f1a5f5e3a546cdeab6fcfbebe989" exitCode=0 Dec 01 10:48:06 crc kubenswrapper[4909]: I1201 10:48:06.457896 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tb8bv" event={"ID":"76721a6a-527e-4c51-8fbb-5a63dfb515a0","Type":"ContainerDied","Data":"ba3e371b99cd19fd004a535768c38eb3df40f1a5f5e3a546cdeab6fcfbebe989"} Dec 01 10:48:06 crc kubenswrapper[4909]: I1201 10:48:06.461289 4909 generic.go:334] "Generic (PLEG): container finished" podID="5ae5889f-72cd-4333-9095-93bf13bcdc14" containerID="e02bfc85d62a759463f53a75b25bb74cfaf679f7e4b51d3ee20476be4578288d" exitCode=0 Dec 01 10:48:06 crc kubenswrapper[4909]: I1201 10:48:06.461508 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-z4t9q" event={"ID":"5ae5889f-72cd-4333-9095-93bf13bcdc14","Type":"ContainerDied","Data":"e02bfc85d62a759463f53a75b25bb74cfaf679f7e4b51d3ee20476be4578288d"} Dec 01 10:48:06 crc kubenswrapper[4909]: I1201 10:48:06.475975 4909 generic.go:334] "Generic (PLEG): container finished" podID="e50ed289-5809-492b-a95a-895da6ba0c76" containerID="1eb1437d42668fff56e46dc63ac5a12147734fde7aacbb674871a966e6b5541c" exitCode=0 Dec 01 10:48:06 crc kubenswrapper[4909]: I1201 10:48:06.476062 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-891a-account-create-update-dpwsn" event={"ID":"e50ed289-5809-492b-a95a-895da6ba0c76","Type":"ContainerDied","Data":"1eb1437d42668fff56e46dc63ac5a12147734fde7aacbb674871a966e6b5541c"} Dec 01 
10:48:06 crc kubenswrapper[4909]: I1201 10:48:06.487957 4909 generic.go:334] "Generic (PLEG): container finished" podID="9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36" containerID="2c9a8a3316e52561aa182934f04e3406c535bac7962bc6215d33f1aae52b2842" exitCode=0 Dec 01 10:48:06 crc kubenswrapper[4909]: I1201 10:48:06.488051 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cbeb-account-create-update-8lttg" event={"ID":"9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36","Type":"ContainerDied","Data":"2c9a8a3316e52561aa182934f04e3406c535bac7962bc6215d33f1aae52b2842"} Dec 01 10:48:06 crc kubenswrapper[4909]: I1201 10:48:06.496516 4909 generic.go:334] "Generic (PLEG): container finished" podID="96ffe604-13b8-4b0e-ba75-ed64c19a9b8e" containerID="a9fbadf42bb43c54520be45792690a305f564b69fbace35a4cc7701ebc44182f" exitCode=0 Dec 01 10:48:06 crc kubenswrapper[4909]: I1201 10:48:06.496649 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8r6cb" event={"ID":"96ffe604-13b8-4b0e-ba75-ed64c19a9b8e","Type":"ContainerDied","Data":"a9fbadf42bb43c54520be45792690a305f564b69fbace35a4cc7701ebc44182f"} Dec 01 10:48:06 crc kubenswrapper[4909]: I1201 10:48:06.526741 4909 generic.go:334] "Generic (PLEG): container finished" podID="77987e39-1fa6-44c9-9f33-c066775eae5a" containerID="9a72c4e1c1c17f9dd6950b27c0afc4b821b7d48906714405808f12e577ae636d" exitCode=0 Dec 01 10:48:06 crc kubenswrapper[4909]: I1201 10:48:06.526857 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv" event={"ID":"77987e39-1fa6-44c9-9f33-c066775eae5a","Type":"ContainerDied","Data":"9a72c4e1c1c17f9dd6950b27c0afc4b821b7d48906714405808f12e577ae636d"} Dec 01 10:48:06 crc kubenswrapper[4909]: I1201 10:48:06.526912 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv" 
event={"ID":"77987e39-1fa6-44c9-9f33-c066775eae5a","Type":"ContainerStarted","Data":"7b493675055357c98861d160bf70397a119d36ec2c499b273a3f437bab2adde7"} Dec 01 10:48:06 crc kubenswrapper[4909]: I1201 10:48:06.527181 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv" Dec 01 10:48:06 crc kubenswrapper[4909]: I1201 10:48:06.539389 4909 generic.go:334] "Generic (PLEG): container finished" podID="5d4cdf99-e33d-4314-ba4d-3cad05e58712" containerID="d84d45e370457c4913ada2e3cffde5a5fc7aa63070a81d6561afb65c04df13a6" exitCode=0 Dec 01 10:48:06 crc kubenswrapper[4909]: I1201 10:48:06.539468 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3f20-account-create-update-xfjjq" event={"ID":"5d4cdf99-e33d-4314-ba4d-3cad05e58712","Type":"ContainerDied","Data":"d84d45e370457c4913ada2e3cffde5a5fc7aa63070a81d6561afb65c04df13a6"} Dec 01 10:48:06 crc kubenswrapper[4909]: I1201 10:48:06.632586 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv" podStartSLOduration=4.63191487 podStartE2EDuration="4.63191487s" podCreationTimestamp="2025-12-01 10:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:48:06.627702492 +0000 UTC m=+1003.862173420" watchObservedRunningTime="2025-12-01 10:48:06.63191487 +0000 UTC m=+1003.866385758" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.347343 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3f20-account-create-update-xfjjq" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.354685 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-tb8bv" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.394002 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-z4t9q" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.402613 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-891a-account-create-update-dpwsn" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.409968 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cbeb-account-create-update-8lttg" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.419751 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8r6cb" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.446232 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ae5889f-72cd-4333-9095-93bf13bcdc14-operator-scripts\") pod \"5ae5889f-72cd-4333-9095-93bf13bcdc14\" (UID: \"5ae5889f-72cd-4333-9095-93bf13bcdc14\") " Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.446289 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e50ed289-5809-492b-a95a-895da6ba0c76-operator-scripts\") pod \"e50ed289-5809-492b-a95a-895da6ba0c76\" (UID: \"e50ed289-5809-492b-a95a-895da6ba0c76\") " Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.446359 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76721a6a-527e-4c51-8fbb-5a63dfb515a0-operator-scripts\") pod \"76721a6a-527e-4c51-8fbb-5a63dfb515a0\" (UID: \"76721a6a-527e-4c51-8fbb-5a63dfb515a0\") " Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.446430 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36-operator-scripts\") pod 
\"9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36\" (UID: \"9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36\") " Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.446480 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtrhm\" (UniqueName: \"kubernetes.io/projected/5ae5889f-72cd-4333-9095-93bf13bcdc14-kube-api-access-vtrhm\") pod \"5ae5889f-72cd-4333-9095-93bf13bcdc14\" (UID: \"5ae5889f-72cd-4333-9095-93bf13bcdc14\") " Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.446498 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvk8g\" (UniqueName: \"kubernetes.io/projected/e50ed289-5809-492b-a95a-895da6ba0c76-kube-api-access-jvk8g\") pod \"e50ed289-5809-492b-a95a-895da6ba0c76\" (UID: \"e50ed289-5809-492b-a95a-895da6ba0c76\") " Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.446543 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp6fz\" (UniqueName: \"kubernetes.io/projected/96ffe604-13b8-4b0e-ba75-ed64c19a9b8e-kube-api-access-hp6fz\") pod \"96ffe604-13b8-4b0e-ba75-ed64c19a9b8e\" (UID: \"96ffe604-13b8-4b0e-ba75-ed64c19a9b8e\") " Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.446581 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9vq9\" (UniqueName: \"kubernetes.io/projected/76721a6a-527e-4c51-8fbb-5a63dfb515a0-kube-api-access-l9vq9\") pod \"76721a6a-527e-4c51-8fbb-5a63dfb515a0\" (UID: \"76721a6a-527e-4c51-8fbb-5a63dfb515a0\") " Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.446602 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mjnw\" (UniqueName: \"kubernetes.io/projected/9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36-kube-api-access-6mjnw\") pod \"9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36\" (UID: \"9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36\") " Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.446644 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96ffe604-13b8-4b0e-ba75-ed64c19a9b8e-operator-scripts\") pod \"96ffe604-13b8-4b0e-ba75-ed64c19a9b8e\" (UID: \"96ffe604-13b8-4b0e-ba75-ed64c19a9b8e\") " Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.446665 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d4cdf99-e33d-4314-ba4d-3cad05e58712-operator-scripts\") pod \"5d4cdf99-e33d-4314-ba4d-3cad05e58712\" (UID: \"5d4cdf99-e33d-4314-ba4d-3cad05e58712\") " Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.446723 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvm4h\" (UniqueName: \"kubernetes.io/projected/5d4cdf99-e33d-4314-ba4d-3cad05e58712-kube-api-access-mvm4h\") pod \"5d4cdf99-e33d-4314-ba4d-3cad05e58712\" (UID: \"5d4cdf99-e33d-4314-ba4d-3cad05e58712\") " Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.451431 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76721a6a-527e-4c51-8fbb-5a63dfb515a0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "76721a6a-527e-4c51-8fbb-5a63dfb515a0" (UID: "76721a6a-527e-4c51-8fbb-5a63dfb515a0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.451960 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ae5889f-72cd-4333-9095-93bf13bcdc14-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5ae5889f-72cd-4333-9095-93bf13bcdc14" (UID: "5ae5889f-72cd-4333-9095-93bf13bcdc14"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.452858 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e50ed289-5809-492b-a95a-895da6ba0c76-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e50ed289-5809-492b-a95a-895da6ba0c76" (UID: "e50ed289-5809-492b-a95a-895da6ba0c76"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.453204 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36" (UID: "9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.459206 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d4cdf99-e33d-4314-ba4d-3cad05e58712-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d4cdf99-e33d-4314-ba4d-3cad05e58712" (UID: "5d4cdf99-e33d-4314-ba4d-3cad05e58712"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.459431 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96ffe604-13b8-4b0e-ba75-ed64c19a9b8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "96ffe604-13b8-4b0e-ba75-ed64c19a9b8e" (UID: "96ffe604-13b8-4b0e-ba75-ed64c19a9b8e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.490498 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d4cdf99-e33d-4314-ba4d-3cad05e58712-kube-api-access-mvm4h" (OuterVolumeSpecName: "kube-api-access-mvm4h") pod "5d4cdf99-e33d-4314-ba4d-3cad05e58712" (UID: "5d4cdf99-e33d-4314-ba4d-3cad05e58712"). InnerVolumeSpecName "kube-api-access-mvm4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.490774 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ae5889f-72cd-4333-9095-93bf13bcdc14-kube-api-access-vtrhm" (OuterVolumeSpecName: "kube-api-access-vtrhm") pod "5ae5889f-72cd-4333-9095-93bf13bcdc14" (UID: "5ae5889f-72cd-4333-9095-93bf13bcdc14"). InnerVolumeSpecName "kube-api-access-vtrhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.491462 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96ffe604-13b8-4b0e-ba75-ed64c19a9b8e-kube-api-access-hp6fz" (OuterVolumeSpecName: "kube-api-access-hp6fz") pod "96ffe604-13b8-4b0e-ba75-ed64c19a9b8e" (UID: "96ffe604-13b8-4b0e-ba75-ed64c19a9b8e"). InnerVolumeSpecName "kube-api-access-hp6fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.493220 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e50ed289-5809-492b-a95a-895da6ba0c76-kube-api-access-jvk8g" (OuterVolumeSpecName: "kube-api-access-jvk8g") pod "e50ed289-5809-492b-a95a-895da6ba0c76" (UID: "e50ed289-5809-492b-a95a-895da6ba0c76"). InnerVolumeSpecName "kube-api-access-jvk8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.497051 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36-kube-api-access-6mjnw" (OuterVolumeSpecName: "kube-api-access-6mjnw") pod "9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36" (UID: "9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36"). InnerVolumeSpecName "kube-api-access-6mjnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.497692 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76721a6a-527e-4c51-8fbb-5a63dfb515a0-kube-api-access-l9vq9" (OuterVolumeSpecName: "kube-api-access-l9vq9") pod "76721a6a-527e-4c51-8fbb-5a63dfb515a0" (UID: "76721a6a-527e-4c51-8fbb-5a63dfb515a0"). InnerVolumeSpecName "kube-api-access-l9vq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.549024 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvm4h\" (UniqueName: \"kubernetes.io/projected/5d4cdf99-e33d-4314-ba4d-3cad05e58712-kube-api-access-mvm4h\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.549073 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ae5889f-72cd-4333-9095-93bf13bcdc14-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.549084 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e50ed289-5809-492b-a95a-895da6ba0c76-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.549097 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/76721a6a-527e-4c51-8fbb-5a63dfb515a0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.549108 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.549117 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtrhm\" (UniqueName: \"kubernetes.io/projected/5ae5889f-72cd-4333-9095-93bf13bcdc14-kube-api-access-vtrhm\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.549131 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvk8g\" (UniqueName: \"kubernetes.io/projected/e50ed289-5809-492b-a95a-895da6ba0c76-kube-api-access-jvk8g\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.549142 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp6fz\" (UniqueName: \"kubernetes.io/projected/96ffe604-13b8-4b0e-ba75-ed64c19a9b8e-kube-api-access-hp6fz\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.549152 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9vq9\" (UniqueName: \"kubernetes.io/projected/76721a6a-527e-4c51-8fbb-5a63dfb515a0-kube-api-access-l9vq9\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.549163 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mjnw\" (UniqueName: \"kubernetes.io/projected/9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36-kube-api-access-6mjnw\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.549174 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/96ffe604-13b8-4b0e-ba75-ed64c19a9b8e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.549184 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d4cdf99-e33d-4314-ba4d-3cad05e58712-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.594679 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-891a-account-create-update-dpwsn" event={"ID":"e50ed289-5809-492b-a95a-895da6ba0c76","Type":"ContainerDied","Data":"669b60b1806b9b5550913e6c8e2567fc8ab73e4420cd9623575d569cfe117063"} Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.594752 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="669b60b1806b9b5550913e6c8e2567fc8ab73e4420cd9623575d569cfe117063" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.594858 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-891a-account-create-update-dpwsn" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.603943 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cbeb-account-create-update-8lttg" event={"ID":"9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36","Type":"ContainerDied","Data":"2b31123250b21141cc9595093614be93b4dc293e1791f9f58393878ee1d0e978"} Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.603992 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b31123250b21141cc9595093614be93b4dc293e1791f9f58393878ee1d0e978" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.604058 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cbeb-account-create-update-8lttg" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.608905 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8r6cb" event={"ID":"96ffe604-13b8-4b0e-ba75-ed64c19a9b8e","Type":"ContainerDied","Data":"e14e0b6cc054487219e0f62093320049247c78307c2c23efee1cb9d4a6ee2788"} Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.608975 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e14e0b6cc054487219e0f62093320049247c78307c2c23efee1cb9d4a6ee2788" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.609055 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8r6cb" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.610964 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3f20-account-create-update-xfjjq" event={"ID":"5d4cdf99-e33d-4314-ba4d-3cad05e58712","Type":"ContainerDied","Data":"e9babaa2089f715de7e574d1ab563c8eb40f2a5da321ce8c8e3560ea50f04b47"} Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.610996 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9babaa2089f715de7e574d1ab563c8eb40f2a5da321ce8c8e3560ea50f04b47" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.611146 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3f20-account-create-update-xfjjq" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.612611 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ffhtn" event={"ID":"0182ce0e-1710-4fdb-b7cd-6143b66d6c0b","Type":"ContainerStarted","Data":"061e842010ad862f56cb5f42076ec711792d6754d9348913fbd62ad54e101f0b"} Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.614153 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-tb8bv" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.614141 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tb8bv" event={"ID":"76721a6a-527e-4c51-8fbb-5a63dfb515a0","Type":"ContainerDied","Data":"046aaebcc9367ab0ae26388635242f91840924cac702f698c3ecc9b59e710c37"} Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.614202 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="046aaebcc9367ab0ae26388635242f91840924cac702f698c3ecc9b59e710c37" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.615237 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-z4t9q" event={"ID":"5ae5889f-72cd-4333-9095-93bf13bcdc14","Type":"ContainerDied","Data":"7998faa67c141dcdbd0c2446fc15b3173400b156e9fd6fc0ec8c0996373f4c34"} Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.615363 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7998faa67c141dcdbd0c2446fc15b3173400b156e9fd6fc0ec8c0996373f4c34" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.615432 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-z4t9q" Dec 01 10:48:10 crc kubenswrapper[4909]: I1201 10:48:10.646534 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-ffhtn" podStartSLOduration=2.590855645 podStartE2EDuration="7.646502949s" podCreationTimestamp="2025-12-01 10:48:03 +0000 UTC" firstStartedPulling="2025-12-01 10:48:05.152373052 +0000 UTC m=+1002.386843950" lastFinishedPulling="2025-12-01 10:48:10.208020356 +0000 UTC m=+1007.442491254" observedRunningTime="2025-12-01 10:48:10.641096613 +0000 UTC m=+1007.875567531" watchObservedRunningTime="2025-12-01 10:48:10.646502949 +0000 UTC m=+1007.880973857" Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.157081 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv" Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.245004 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7vlz2"] Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.245319 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" podUID="ba361527-47be-4c33-a659-f90cdabb757c" containerName="dnsmasq-dns" containerID="cri-o://beeec8123cfad4adc9e251a33355dd05a7a2192e83ad3f4b7a376ed36474ce5c" gracePeriod=10 Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.644432 4909 generic.go:334] "Generic (PLEG): container finished" podID="0182ce0e-1710-4fdb-b7cd-6143b66d6c0b" containerID="061e842010ad862f56cb5f42076ec711792d6754d9348913fbd62ad54e101f0b" exitCode=0 Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.644507 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ffhtn" event={"ID":"0182ce0e-1710-4fdb-b7cd-6143b66d6c0b","Type":"ContainerDied","Data":"061e842010ad862f56cb5f42076ec711792d6754d9348913fbd62ad54e101f0b"} Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 
10:48:13.647510 4909 generic.go:334] "Generic (PLEG): container finished" podID="ba361527-47be-4c33-a659-f90cdabb757c" containerID="beeec8123cfad4adc9e251a33355dd05a7a2192e83ad3f4b7a376ed36474ce5c" exitCode=0 Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.647545 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" event={"ID":"ba361527-47be-4c33-a659-f90cdabb757c","Type":"ContainerDied","Data":"beeec8123cfad4adc9e251a33355dd05a7a2192e83ad3f4b7a376ed36474ce5c"} Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.647566 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" event={"ID":"ba361527-47be-4c33-a659-f90cdabb757c","Type":"ContainerDied","Data":"362e397e585607a3d595d830d68c30abac41f5e852372b774885f9a22d032a5e"} Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.647576 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="362e397e585607a3d595d830d68c30abac41f5e852372b774885f9a22d032a5e" Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.720863 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.806714 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-config\") pod \"ba361527-47be-4c33-a659-f90cdabb757c\" (UID: \"ba361527-47be-4c33-a659-f90cdabb757c\") " Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.806840 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-dns-svc\") pod \"ba361527-47be-4c33-a659-f90cdabb757c\" (UID: \"ba361527-47be-4c33-a659-f90cdabb757c\") " Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.806943 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-ovsdbserver-nb\") pod \"ba361527-47be-4c33-a659-f90cdabb757c\" (UID: \"ba361527-47be-4c33-a659-f90cdabb757c\") " Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.807023 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd2n2\" (UniqueName: \"kubernetes.io/projected/ba361527-47be-4c33-a659-f90cdabb757c-kube-api-access-sd2n2\") pod \"ba361527-47be-4c33-a659-f90cdabb757c\" (UID: \"ba361527-47be-4c33-a659-f90cdabb757c\") " Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.807075 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-ovsdbserver-sb\") pod \"ba361527-47be-4c33-a659-f90cdabb757c\" (UID: \"ba361527-47be-4c33-a659-f90cdabb757c\") " Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.830126 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ba361527-47be-4c33-a659-f90cdabb757c-kube-api-access-sd2n2" (OuterVolumeSpecName: "kube-api-access-sd2n2") pod "ba361527-47be-4c33-a659-f90cdabb757c" (UID: "ba361527-47be-4c33-a659-f90cdabb757c"). InnerVolumeSpecName "kube-api-access-sd2n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.875549 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ba361527-47be-4c33-a659-f90cdabb757c" (UID: "ba361527-47be-4c33-a659-f90cdabb757c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.877145 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ba361527-47be-4c33-a659-f90cdabb757c" (UID: "ba361527-47be-4c33-a659-f90cdabb757c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.891377 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-config" (OuterVolumeSpecName: "config") pod "ba361527-47be-4c33-a659-f90cdabb757c" (UID: "ba361527-47be-4c33-a659-f90cdabb757c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.893603 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ba361527-47be-4c33-a659-f90cdabb757c" (UID: "ba361527-47be-4c33-a659-f90cdabb757c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.909710 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd2n2\" (UniqueName: \"kubernetes.io/projected/ba361527-47be-4c33-a659-f90cdabb757c-kube-api-access-sd2n2\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.909774 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.909789 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.909805 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:13 crc kubenswrapper[4909]: I1201 10:48:13.909818 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba361527-47be-4c33-a659-f90cdabb757c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:14 crc kubenswrapper[4909]: I1201 10:48:14.655459 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-7vlz2" Dec 01 10:48:14 crc kubenswrapper[4909]: I1201 10:48:14.694585 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7vlz2"] Dec 01 10:48:14 crc kubenswrapper[4909]: I1201 10:48:14.703968 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7vlz2"] Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.071629 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ffhtn" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.134375 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0182ce0e-1710-4fdb-b7cd-6143b66d6c0b-combined-ca-bundle\") pod \"0182ce0e-1710-4fdb-b7cd-6143b66d6c0b\" (UID: \"0182ce0e-1710-4fdb-b7cd-6143b66d6c0b\") " Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.134531 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j8wd\" (UniqueName: \"kubernetes.io/projected/0182ce0e-1710-4fdb-b7cd-6143b66d6c0b-kube-api-access-7j8wd\") pod \"0182ce0e-1710-4fdb-b7cd-6143b66d6c0b\" (UID: \"0182ce0e-1710-4fdb-b7cd-6143b66d6c0b\") " Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.134575 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0182ce0e-1710-4fdb-b7cd-6143b66d6c0b-config-data\") pod \"0182ce0e-1710-4fdb-b7cd-6143b66d6c0b\" (UID: \"0182ce0e-1710-4fdb-b7cd-6143b66d6c0b\") " Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.147892 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0182ce0e-1710-4fdb-b7cd-6143b66d6c0b-kube-api-access-7j8wd" (OuterVolumeSpecName: "kube-api-access-7j8wd") pod "0182ce0e-1710-4fdb-b7cd-6143b66d6c0b" (UID: 
"0182ce0e-1710-4fdb-b7cd-6143b66d6c0b"). InnerVolumeSpecName "kube-api-access-7j8wd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.162687 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0182ce0e-1710-4fdb-b7cd-6143b66d6c0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0182ce0e-1710-4fdb-b7cd-6143b66d6c0b" (UID: "0182ce0e-1710-4fdb-b7cd-6143b66d6c0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.185291 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0182ce0e-1710-4fdb-b7cd-6143b66d6c0b-config-data" (OuterVolumeSpecName: "config-data") pod "0182ce0e-1710-4fdb-b7cd-6143b66d6c0b" (UID: "0182ce0e-1710-4fdb-b7cd-6143b66d6c0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.237788 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j8wd\" (UniqueName: \"kubernetes.io/projected/0182ce0e-1710-4fdb-b7cd-6143b66d6c0b-kube-api-access-7j8wd\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.237905 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0182ce0e-1710-4fdb-b7cd-6143b66d6c0b-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.237923 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0182ce0e-1710-4fdb-b7cd-6143b66d6c0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.267683 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ba361527-47be-4c33-a659-f90cdabb757c" path="/var/lib/kubelet/pods/ba361527-47be-4c33-a659-f90cdabb757c/volumes" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.664606 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ffhtn" event={"ID":"0182ce0e-1710-4fdb-b7cd-6143b66d6c0b","Type":"ContainerDied","Data":"9663fd8a78cf37e984eb1201d7b76491bedaa8c976fcadd6f85b8272b9bf939b"} Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.664682 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9663fd8a78cf37e984eb1201d7b76491bedaa8c976fcadd6f85b8272b9bf939b" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.664756 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ffhtn" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.853366 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-x6zzh"] Dec 01 10:48:15 crc kubenswrapper[4909]: E1201 10:48:15.853902 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76721a6a-527e-4c51-8fbb-5a63dfb515a0" containerName="mariadb-database-create" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.853925 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="76721a6a-527e-4c51-8fbb-5a63dfb515a0" containerName="mariadb-database-create" Dec 01 10:48:15 crc kubenswrapper[4909]: E1201 10:48:15.853946 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ffe604-13b8-4b0e-ba75-ed64c19a9b8e" containerName="mariadb-database-create" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.853955 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ffe604-13b8-4b0e-ba75-ed64c19a9b8e" containerName="mariadb-database-create" Dec 01 10:48:15 crc kubenswrapper[4909]: E1201 10:48:15.853968 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d4cdf99-e33d-4314-ba4d-3cad05e58712" 
containerName="mariadb-account-create-update" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.853976 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d4cdf99-e33d-4314-ba4d-3cad05e58712" containerName="mariadb-account-create-update" Dec 01 10:48:15 crc kubenswrapper[4909]: E1201 10:48:15.853990 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0182ce0e-1710-4fdb-b7cd-6143b66d6c0b" containerName="keystone-db-sync" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.853999 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0182ce0e-1710-4fdb-b7cd-6143b66d6c0b" containerName="keystone-db-sync" Dec 01 10:48:15 crc kubenswrapper[4909]: E1201 10:48:15.854017 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba361527-47be-4c33-a659-f90cdabb757c" containerName="init" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.854025 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba361527-47be-4c33-a659-f90cdabb757c" containerName="init" Dec 01 10:48:15 crc kubenswrapper[4909]: E1201 10:48:15.854041 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba361527-47be-4c33-a659-f90cdabb757c" containerName="dnsmasq-dns" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.854048 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba361527-47be-4c33-a659-f90cdabb757c" containerName="dnsmasq-dns" Dec 01 10:48:15 crc kubenswrapper[4909]: E1201 10:48:15.854061 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae5889f-72cd-4333-9095-93bf13bcdc14" containerName="mariadb-database-create" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.854068 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae5889f-72cd-4333-9095-93bf13bcdc14" containerName="mariadb-database-create" Dec 01 10:48:15 crc kubenswrapper[4909]: E1201 10:48:15.854086 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36" containerName="mariadb-account-create-update" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.854093 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36" containerName="mariadb-account-create-update" Dec 01 10:48:15 crc kubenswrapper[4909]: E1201 10:48:15.854112 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e50ed289-5809-492b-a95a-895da6ba0c76" containerName="mariadb-account-create-update" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.854121 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e50ed289-5809-492b-a95a-895da6ba0c76" containerName="mariadb-account-create-update" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.854325 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e50ed289-5809-492b-a95a-895da6ba0c76" containerName="mariadb-account-create-update" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.854352 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0182ce0e-1710-4fdb-b7cd-6143b66d6c0b" containerName="keystone-db-sync" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.854360 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="76721a6a-527e-4c51-8fbb-5a63dfb515a0" containerName="mariadb-database-create" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.854370 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ffe604-13b8-4b0e-ba75-ed64c19a9b8e" containerName="mariadb-database-create" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.854381 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba361527-47be-4c33-a659-f90cdabb757c" containerName="dnsmasq-dns" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.854397 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d4cdf99-e33d-4314-ba4d-3cad05e58712" containerName="mariadb-account-create-update" Dec 01 10:48:15 
crc kubenswrapper[4909]: I1201 10:48:15.854407 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ae5889f-72cd-4333-9095-93bf13bcdc14" containerName="mariadb-database-create" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.854419 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36" containerName="mariadb-account-create-update" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.855646 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.871714 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-x6zzh"] Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.925785 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-tr79n"] Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.927369 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.937383 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.937403 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.937828 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fpd6f" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.938060 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.938270 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.952756 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tr79n"] Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.963287 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-dns-svc\") pod \"dnsmasq-dns-6546db6db7-x6zzh\" (UID: \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\") " pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.963351 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-x6zzh\" (UID: \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\") " pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.963397 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-fernet-keys\") pod \"keystone-bootstrap-tr79n\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.963461 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-x6zzh\" (UID: \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\") " pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.963485 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-credential-keys\") pod \"keystone-bootstrap-tr79n\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.963518 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-combined-ca-bundle\") pod \"keystone-bootstrap-tr79n\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.963544 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-config-data\") pod \"keystone-bootstrap-tr79n\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.963587 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-config\") pod \"dnsmasq-dns-6546db6db7-x6zzh\" (UID: \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\") " pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.963614 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-scripts\") pod \"keystone-bootstrap-tr79n\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.963913 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9nbh\" (UniqueName: \"kubernetes.io/projected/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-kube-api-access-n9nbh\") pod \"dnsmasq-dns-6546db6db7-x6zzh\" (UID: \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\") " pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" Dec 01 10:48:15 crc kubenswrapper[4909]: I1201 10:48:15.963985 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw8m4\" (UniqueName: \"kubernetes.io/projected/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-kube-api-access-dw8m4\") pod \"keystone-bootstrap-tr79n\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.066090 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-config\") pod \"dnsmasq-dns-6546db6db7-x6zzh\" (UID: \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\") " pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.066157 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-scripts\") pod \"keystone-bootstrap-tr79n\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.066197 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9nbh\" (UniqueName: \"kubernetes.io/projected/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-kube-api-access-n9nbh\") pod \"dnsmasq-dns-6546db6db7-x6zzh\" (UID: \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\") " pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.066226 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw8m4\" (UniqueName: \"kubernetes.io/projected/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-kube-api-access-dw8m4\") pod \"keystone-bootstrap-tr79n\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.066292 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-dns-svc\") pod \"dnsmasq-dns-6546db6db7-x6zzh\" (UID: \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\") " pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.066322 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-x6zzh\" (UID: \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\") " pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.066359 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-fernet-keys\") pod \"keystone-bootstrap-tr79n\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.066410 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-x6zzh\" (UID: \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\") " pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.066436 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-credential-keys\") pod \"keystone-bootstrap-tr79n\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.066472 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-combined-ca-bundle\") pod \"keystone-bootstrap-tr79n\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.066496 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-config-data\") pod \"keystone-bootstrap-tr79n\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.067576 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-dns-svc\") pod \"dnsmasq-dns-6546db6db7-x6zzh\" 
(UID: \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\") " pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.068590 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-config\") pod \"dnsmasq-dns-6546db6db7-x6zzh\" (UID: \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\") " pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.070902 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-x6zzh\" (UID: \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\") " pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.071644 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-x6zzh\" (UID: \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\") " pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.074598 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-combined-ca-bundle\") pod \"keystone-bootstrap-tr79n\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.078232 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-credential-keys\") pod \"keystone-bootstrap-tr79n\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:16 crc 
kubenswrapper[4909]: I1201 10:48:16.083169 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-config-data\") pod \"keystone-bootstrap-tr79n\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.099274 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-scripts\") pod \"keystone-bootstrap-tr79n\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.102614 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-fernet-keys\") pod \"keystone-bootstrap-tr79n\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.105319 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9nbh\" (UniqueName: \"kubernetes.io/projected/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-kube-api-access-n9nbh\") pod \"dnsmasq-dns-6546db6db7-x6zzh\" (UID: \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\") " pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.110563 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw8m4\" (UniqueName: \"kubernetes.io/projected/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-kube-api-access-dw8m4\") pod \"keystone-bootstrap-tr79n\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.150605 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-6nt8w"] 
Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.161854 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.167560 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mgtq4" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.167843 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.168170 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.170062 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-combined-ca-bundle\") pod \"cinder-db-sync-6nt8w\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.170351 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh9sz\" (UniqueName: \"kubernetes.io/projected/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-kube-api-access-wh9sz\") pod \"cinder-db-sync-6nt8w\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.170461 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-etc-machine-id\") pod \"cinder-db-sync-6nt8w\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.170605 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-db-sync-config-data\") pod \"cinder-db-sync-6nt8w\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.170724 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-config-data\") pod \"cinder-db-sync-6nt8w\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.170852 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-scripts\") pod \"cinder-db-sync-6nt8w\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.175025 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6nt8w"] Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.189457 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.214969 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.217364 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.231195 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.236313 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.274827 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.276673 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-combined-ca-bundle\") pod \"cinder-db-sync-6nt8w\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.276716 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-scripts\") pod \"ceilometer-0\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.276757 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.276780 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce831ec4-3f79-4a97-856b-00438d195fac-run-httpd\") pod \"ceilometer-0\" (UID: 
\"ce831ec4-3f79-4a97-856b-00438d195fac\") " pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.276824 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh9sz\" (UniqueName: \"kubernetes.io/projected/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-kube-api-access-wh9sz\") pod \"cinder-db-sync-6nt8w\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.276842 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-etc-machine-id\") pod \"cinder-db-sync-6nt8w\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.276885 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82dgz\" (UniqueName: \"kubernetes.io/projected/ce831ec4-3f79-4a97-856b-00438d195fac-kube-api-access-82dgz\") pod \"ceilometer-0\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.276928 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-db-sync-config-data\") pod \"cinder-db-sync-6nt8w\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.276952 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-config-data\") pod \"cinder-db-sync-6nt8w\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:48:16 crc 
kubenswrapper[4909]: I1201 10:48:16.276976 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.277005 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-scripts\") pod \"cinder-db-sync-6nt8w\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.277040 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce831ec4-3f79-4a97-856b-00438d195fac-log-httpd\") pod \"ceilometer-0\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.277062 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-config-data\") pod \"ceilometer-0\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.278339 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-etc-machine-id\") pod \"cinder-db-sync-6nt8w\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.291764 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-combined-ca-bundle\") pod \"cinder-db-sync-6nt8w\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.294016 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.305215 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-db-sync-config-data\") pod \"cinder-db-sync-6nt8w\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.316016 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-config-data\") pod \"cinder-db-sync-6nt8w\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.325693 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh9sz\" (UniqueName: \"kubernetes.io/projected/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-kube-api-access-wh9sz\") pod \"cinder-db-sync-6nt8w\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.357405 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-scripts\") pod \"cinder-db-sync-6nt8w\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.358661 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-qqfxj"] Dec 01 10:48:16 crc kubenswrapper[4909]: 
I1201 10:48:16.379642 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qqfxj" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.382060 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-scripts\") pod \"ceilometer-0\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.382116 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.382140 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce831ec4-3f79-4a97-856b-00438d195fac-run-httpd\") pod \"ceilometer-0\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.382174 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82dgz\" (UniqueName: \"kubernetes.io/projected/ce831ec4-3f79-4a97-856b-00438d195fac-kube-api-access-82dgz\") pod \"ceilometer-0\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.382214 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.382239 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce831ec4-3f79-4a97-856b-00438d195fac-log-httpd\") pod \"ceilometer-0\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.382254 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-config-data\") pod \"ceilometer-0\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.383787 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce831ec4-3f79-4a97-856b-00438d195fac-run-httpd\") pod \"ceilometer-0\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.393173 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-46hkv" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.394150 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.394372 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.394795 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qqfxj"] Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.396056 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-config-data\") pod \"ceilometer-0\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " pod="openstack/ceilometer-0" Dec 01 10:48:16 crc 
kubenswrapper[4909]: I1201 10:48:16.396351 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce831ec4-3f79-4a97-856b-00438d195fac-log-httpd\") pod \"ceilometer-0\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.419791 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-2jwcr"] Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.420165 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-scripts\") pod \"ceilometer-0\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.421334 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2jwcr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.426046 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.431236 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.431447 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.431724 4909 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"placement-placement-dockercfg-2gvtb" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.431742 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.443793 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82dgz\" (UniqueName: \"kubernetes.io/projected/ce831ec4-3f79-4a97-856b-00438d195fac-kube-api-access-82dgz\") pod \"ceilometer-0\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.477551 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2jwcr"] Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.485401 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/672cc9fe-2684-446f-8c16-59c03ee23678-combined-ca-bundle\") pod \"neutron-db-sync-qqfxj\" (UID: \"672cc9fe-2684-446f-8c16-59c03ee23678\") " pod="openstack/neutron-db-sync-qqfxj" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.485460 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkgfp\" (UniqueName: \"kubernetes.io/projected/672cc9fe-2684-446f-8c16-59c03ee23678-kube-api-access-qkgfp\") pod \"neutron-db-sync-qqfxj\" (UID: \"672cc9fe-2684-446f-8c16-59c03ee23678\") " pod="openstack/neutron-db-sync-qqfxj" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.485522 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/672cc9fe-2684-446f-8c16-59c03ee23678-config\") pod \"neutron-db-sync-qqfxj\" (UID: \"672cc9fe-2684-446f-8c16-59c03ee23678\") " pod="openstack/neutron-db-sync-qqfxj" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.490723 4909 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.496302 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-975kr"] Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.497609 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-975kr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.500813 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.501255 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-dt6l6" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.518360 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-x6zzh"] Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.537499 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-975kr"] Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.551962 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-xqrbt"] Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.554658 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.560844 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-xqrbt"] Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.587681 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vckjp\" (UniqueName: \"kubernetes.io/projected/e36ea017-d935-4011-8590-0aaa002229de-kube-api-access-vckjp\") pod \"placement-db-sync-2jwcr\" (UID: \"e36ea017-d935-4011-8590-0aaa002229de\") " pod="openstack/placement-db-sync-2jwcr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.588111 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e36ea017-d935-4011-8590-0aaa002229de-scripts\") pod \"placement-db-sync-2jwcr\" (UID: \"e36ea017-d935-4011-8590-0aaa002229de\") " pod="openstack/placement-db-sync-2jwcr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.588190 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/672cc9fe-2684-446f-8c16-59c03ee23678-combined-ca-bundle\") pod \"neutron-db-sync-qqfxj\" (UID: \"672cc9fe-2684-446f-8c16-59c03ee23678\") " pod="openstack/neutron-db-sync-qqfxj" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.588210 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkgfp\" (UniqueName: \"kubernetes.io/projected/672cc9fe-2684-446f-8c16-59c03ee23678-kube-api-access-qkgfp\") pod \"neutron-db-sync-qqfxj\" (UID: \"672cc9fe-2684-446f-8c16-59c03ee23678\") " pod="openstack/neutron-db-sync-qqfxj" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.591156 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e36ea017-d935-4011-8590-0aaa002229de-logs\") pod \"placement-db-sync-2jwcr\" (UID: \"e36ea017-d935-4011-8590-0aaa002229de\") " pod="openstack/placement-db-sync-2jwcr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.591271 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e36ea017-d935-4011-8590-0aaa002229de-config-data\") pod \"placement-db-sync-2jwcr\" (UID: \"e36ea017-d935-4011-8590-0aaa002229de\") " pod="openstack/placement-db-sync-2jwcr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.591300 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/672cc9fe-2684-446f-8c16-59c03ee23678-config\") pod \"neutron-db-sync-qqfxj\" (UID: \"672cc9fe-2684-446f-8c16-59c03ee23678\") " pod="openstack/neutron-db-sync-qqfxj" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.591337 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e36ea017-d935-4011-8590-0aaa002229de-combined-ca-bundle\") pod \"placement-db-sync-2jwcr\" (UID: \"e36ea017-d935-4011-8590-0aaa002229de\") " pod="openstack/placement-db-sync-2jwcr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.595984 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.602730 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/672cc9fe-2684-446f-8c16-59c03ee23678-config\") pod \"neutron-db-sync-qqfxj\" (UID: \"672cc9fe-2684-446f-8c16-59c03ee23678\") " pod="openstack/neutron-db-sync-qqfxj" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.616772 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/672cc9fe-2684-446f-8c16-59c03ee23678-combined-ca-bundle\") pod \"neutron-db-sync-qqfxj\" (UID: \"672cc9fe-2684-446f-8c16-59c03ee23678\") " pod="openstack/neutron-db-sync-qqfxj" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.618228 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkgfp\" (UniqueName: \"kubernetes.io/projected/672cc9fe-2684-446f-8c16-59c03ee23678-kube-api-access-qkgfp\") pod \"neutron-db-sync-qqfxj\" (UID: \"672cc9fe-2684-446f-8c16-59c03ee23678\") " pod="openstack/neutron-db-sync-qqfxj" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.694198 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e36ea017-d935-4011-8590-0aaa002229de-config-data\") pod \"placement-db-sync-2jwcr\" (UID: \"e36ea017-d935-4011-8590-0aaa002229de\") " pod="openstack/placement-db-sync-2jwcr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.694259 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e36ea017-d935-4011-8590-0aaa002229de-combined-ca-bundle\") pod \"placement-db-sync-2jwcr\" (UID: \"e36ea017-d935-4011-8590-0aaa002229de\") " pod="openstack/placement-db-sync-2jwcr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.694312 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd6cd1c-99b8-4639-828e-9585790b9d26-combined-ca-bundle\") pod \"barbican-db-sync-975kr\" (UID: \"8cd6cd1c-99b8-4639-828e-9585790b9d26\") " pod="openstack/barbican-db-sync-975kr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.694352 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-xqrbt\" (UID: \"c067d444-fbd3-4601-bc51-29f9d7c25afb\") " pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.694399 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vckjp\" (UniqueName: \"kubernetes.io/projected/e36ea017-d935-4011-8590-0aaa002229de-kube-api-access-vckjp\") pod \"placement-db-sync-2jwcr\" (UID: \"e36ea017-d935-4011-8590-0aaa002229de\") " pod="openstack/placement-db-sync-2jwcr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.694421 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs6nd\" (UniqueName: \"kubernetes.io/projected/8cd6cd1c-99b8-4639-828e-9585790b9d26-kube-api-access-bs6nd\") pod \"barbican-db-sync-975kr\" (UID: \"8cd6cd1c-99b8-4639-828e-9585790b9d26\") " pod="openstack/barbican-db-sync-975kr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.694452 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e36ea017-d935-4011-8590-0aaa002229de-scripts\") pod \"placement-db-sync-2jwcr\" (UID: \"e36ea017-d935-4011-8590-0aaa002229de\") " pod="openstack/placement-db-sync-2jwcr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.694498 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8cd6cd1c-99b8-4639-828e-9585790b9d26-db-sync-config-data\") pod \"barbican-db-sync-975kr\" (UID: \"8cd6cd1c-99b8-4639-828e-9585790b9d26\") " pod="openstack/barbican-db-sync-975kr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.694532 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e36ea017-d935-4011-8590-0aaa002229de-logs\") pod \"placement-db-sync-2jwcr\" (UID: \"e36ea017-d935-4011-8590-0aaa002229de\") " pod="openstack/placement-db-sync-2jwcr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.694555 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-xqrbt\" (UID: \"c067d444-fbd3-4601-bc51-29f9d7c25afb\") " pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.694578 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-config\") pod \"dnsmasq-dns-7987f74bbc-xqrbt\" (UID: \"c067d444-fbd3-4601-bc51-29f9d7c25afb\") " pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.694602 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46knm\" (UniqueName: \"kubernetes.io/projected/c067d444-fbd3-4601-bc51-29f9d7c25afb-kube-api-access-46knm\") pod \"dnsmasq-dns-7987f74bbc-xqrbt\" (UID: \"c067d444-fbd3-4601-bc51-29f9d7c25afb\") " pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.694623 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-xqrbt\" (UID: \"c067d444-fbd3-4601-bc51-29f9d7c25afb\") " pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.695792 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e36ea017-d935-4011-8590-0aaa002229de-logs\") pod \"placement-db-sync-2jwcr\" (UID: \"e36ea017-d935-4011-8590-0aaa002229de\") " pod="openstack/placement-db-sync-2jwcr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.703170 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e36ea017-d935-4011-8590-0aaa002229de-scripts\") pod \"placement-db-sync-2jwcr\" (UID: \"e36ea017-d935-4011-8590-0aaa002229de\") " pod="openstack/placement-db-sync-2jwcr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.707667 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e36ea017-d935-4011-8590-0aaa002229de-combined-ca-bundle\") pod \"placement-db-sync-2jwcr\" (UID: \"e36ea017-d935-4011-8590-0aaa002229de\") " pod="openstack/placement-db-sync-2jwcr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.709764 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e36ea017-d935-4011-8590-0aaa002229de-config-data\") pod \"placement-db-sync-2jwcr\" (UID: \"e36ea017-d935-4011-8590-0aaa002229de\") " pod="openstack/placement-db-sync-2jwcr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.726358 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vckjp\" (UniqueName: 
\"kubernetes.io/projected/e36ea017-d935-4011-8590-0aaa002229de-kube-api-access-vckjp\") pod \"placement-db-sync-2jwcr\" (UID: \"e36ea017-d935-4011-8590-0aaa002229de\") " pod="openstack/placement-db-sync-2jwcr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.802152 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-xqrbt\" (UID: \"c067d444-fbd3-4601-bc51-29f9d7c25afb\") " pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.802541 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs6nd\" (UniqueName: \"kubernetes.io/projected/8cd6cd1c-99b8-4639-828e-9585790b9d26-kube-api-access-bs6nd\") pod \"barbican-db-sync-975kr\" (UID: \"8cd6cd1c-99b8-4639-828e-9585790b9d26\") " pod="openstack/barbican-db-sync-975kr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.802763 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8cd6cd1c-99b8-4639-828e-9585790b9d26-db-sync-config-data\") pod \"barbican-db-sync-975kr\" (UID: \"8cd6cd1c-99b8-4639-828e-9585790b9d26\") " pod="openstack/barbican-db-sync-975kr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.802824 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-xqrbt\" (UID: \"c067d444-fbd3-4601-bc51-29f9d7c25afb\") " pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.802856 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-config\") pod 
\"dnsmasq-dns-7987f74bbc-xqrbt\" (UID: \"c067d444-fbd3-4601-bc51-29f9d7c25afb\") " pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.802909 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46knm\" (UniqueName: \"kubernetes.io/projected/c067d444-fbd3-4601-bc51-29f9d7c25afb-kube-api-access-46knm\") pod \"dnsmasq-dns-7987f74bbc-xqrbt\" (UID: \"c067d444-fbd3-4601-bc51-29f9d7c25afb\") " pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.803291 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-xqrbt\" (UID: \"c067d444-fbd3-4601-bc51-29f9d7c25afb\") " pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.807919 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd6cd1c-99b8-4639-828e-9585790b9d26-combined-ca-bundle\") pod \"barbican-db-sync-975kr\" (UID: \"8cd6cd1c-99b8-4639-828e-9585790b9d26\") " pod="openstack/barbican-db-sync-975kr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.810768 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-xqrbt\" (UID: \"c067d444-fbd3-4601-bc51-29f9d7c25afb\") " pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.811481 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qqfxj" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.814309 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-xqrbt\" (UID: \"c067d444-fbd3-4601-bc51-29f9d7c25afb\") " pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.814736 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-config\") pod \"dnsmasq-dns-7987f74bbc-xqrbt\" (UID: \"c067d444-fbd3-4601-bc51-29f9d7c25afb\") " pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.815071 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-xqrbt\" (UID: \"c067d444-fbd3-4601-bc51-29f9d7c25afb\") " pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.823101 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8cd6cd1c-99b8-4639-828e-9585790b9d26-db-sync-config-data\") pod \"barbican-db-sync-975kr\" (UID: \"8cd6cd1c-99b8-4639-828e-9585790b9d26\") " pod="openstack/barbican-db-sync-975kr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.834176 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs6nd\" (UniqueName: \"kubernetes.io/projected/8cd6cd1c-99b8-4639-828e-9585790b9d26-kube-api-access-bs6nd\") pod \"barbican-db-sync-975kr\" (UID: \"8cd6cd1c-99b8-4639-828e-9585790b9d26\") " pod="openstack/barbican-db-sync-975kr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 
10:48:16.843057 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46knm\" (UniqueName: \"kubernetes.io/projected/c067d444-fbd3-4601-bc51-29f9d7c25afb-kube-api-access-46knm\") pod \"dnsmasq-dns-7987f74bbc-xqrbt\" (UID: \"c067d444-fbd3-4601-bc51-29f9d7c25afb\") " pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.846907 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd6cd1c-99b8-4639-828e-9585790b9d26-combined-ca-bundle\") pod \"barbican-db-sync-975kr\" (UID: \"8cd6cd1c-99b8-4639-828e-9585790b9d26\") " pod="openstack/barbican-db-sync-975kr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.904912 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2jwcr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.937025 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-975kr" Dec 01 10:48:16 crc kubenswrapper[4909]: I1201 10:48:16.948500 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" Dec 01 10:48:17 crc kubenswrapper[4909]: I1201 10:48:17.024134 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tr79n"] Dec 01 10:48:17 crc kubenswrapper[4909]: I1201 10:48:17.127011 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-x6zzh"] Dec 01 10:48:17 crc kubenswrapper[4909]: W1201 10:48:17.152926 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfaee276_d6bb_4ac1_9c18_556e8a7f4e3a.slice/crio-f3cda89f3eaa826e93386c43ea279d9950dd52da5d500e7ab98308ba8b8379b5 WatchSource:0}: Error finding container f3cda89f3eaa826e93386c43ea279d9950dd52da5d500e7ab98308ba8b8379b5: Status 404 returned error can't find the container with id f3cda89f3eaa826e93386c43ea279d9950dd52da5d500e7ab98308ba8b8379b5 Dec 01 10:48:17 crc kubenswrapper[4909]: I1201 10:48:17.199811 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6nt8w"] Dec 01 10:48:17 crc kubenswrapper[4909]: I1201 10:48:17.223144 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:48:17 crc kubenswrapper[4909]: I1201 10:48:17.607527 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2jwcr"] Dec 01 10:48:17 crc kubenswrapper[4909]: W1201 10:48:17.607547 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode36ea017_d935_4011_8590_0aaa002229de.slice/crio-02b30179d0e47693f55e450e87936e8bb058e3557a2164a4b2f01d86e532a705 WatchSource:0}: Error finding container 02b30179d0e47693f55e450e87936e8bb058e3557a2164a4b2f01d86e532a705: Status 404 returned error can't find the container with id 02b30179d0e47693f55e450e87936e8bb058e3557a2164a4b2f01d86e532a705 Dec 01 10:48:17 crc kubenswrapper[4909]: I1201 
10:48:17.620980 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qqfxj"] Dec 01 10:48:17 crc kubenswrapper[4909]: W1201 10:48:17.626344 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod672cc9fe_2684_446f_8c16_59c03ee23678.slice/crio-899066b0f299cf15b8066aadffb711d54ef28b8f5592f98e9a20cfa2a775a4f3 WatchSource:0}: Error finding container 899066b0f299cf15b8066aadffb711d54ef28b8f5592f98e9a20cfa2a775a4f3: Status 404 returned error can't find the container with id 899066b0f299cf15b8066aadffb711d54ef28b8f5592f98e9a20cfa2a775a4f3 Dec 01 10:48:17 crc kubenswrapper[4909]: I1201 10:48:17.688335 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce831ec4-3f79-4a97-856b-00438d195fac","Type":"ContainerStarted","Data":"c39ee7cbe3fe062ae4d5bd3ed31049098069a5f8dc73ca86336a70bd9962b1ae"} Dec 01 10:48:17 crc kubenswrapper[4909]: I1201 10:48:17.692641 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tr79n" event={"ID":"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d","Type":"ContainerStarted","Data":"c8df12725e92ea8b98a39c1775644a76f065b526f47c544c9f7ac33438dfadc4"} Dec 01 10:48:17 crc kubenswrapper[4909]: I1201 10:48:17.692728 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tr79n" event={"ID":"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d","Type":"ContainerStarted","Data":"e18dbbd774b9dd93676fa7317e55d5b062f015cf7bb02191425df43c0681f6c3"} Dec 01 10:48:17 crc kubenswrapper[4909]: I1201 10:48:17.694544 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6nt8w" event={"ID":"00cc5cc5-c22b-4523-96c5-baa507ad0ce1","Type":"ContainerStarted","Data":"d2188bca7aad3341e2f12fa446084138f4da332dbc390011c74225c568d86a68"} Dec 01 10:48:17 crc kubenswrapper[4909]: I1201 10:48:17.702439 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-sync-2jwcr" event={"ID":"e36ea017-d935-4011-8590-0aaa002229de","Type":"ContainerStarted","Data":"02b30179d0e47693f55e450e87936e8bb058e3557a2164a4b2f01d86e532a705"} Dec 01 10:48:17 crc kubenswrapper[4909]: I1201 10:48:17.705560 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qqfxj" event={"ID":"672cc9fe-2684-446f-8c16-59c03ee23678","Type":"ContainerStarted","Data":"899066b0f299cf15b8066aadffb711d54ef28b8f5592f98e9a20cfa2a775a4f3"} Dec 01 10:48:17 crc kubenswrapper[4909]: I1201 10:48:17.714799 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-tr79n" podStartSLOduration=2.714777746 podStartE2EDuration="2.714777746s" podCreationTimestamp="2025-12-01 10:48:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:48:17.712049957 +0000 UTC m=+1014.946520855" watchObservedRunningTime="2025-12-01 10:48:17.714777746 +0000 UTC m=+1014.949248644" Dec 01 10:48:17 crc kubenswrapper[4909]: I1201 10:48:17.722865 4909 generic.go:334] "Generic (PLEG): container finished" podID="bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a" containerID="cce80bfa1b8e1aedac1f45906962b59f5d6bf3ebfaef00feb2b7c1145e14460a" exitCode=0 Dec 01 10:48:17 crc kubenswrapper[4909]: I1201 10:48:17.722969 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" event={"ID":"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a","Type":"ContainerDied","Data":"cce80bfa1b8e1aedac1f45906962b59f5d6bf3ebfaef00feb2b7c1145e14460a"} Dec 01 10:48:17 crc kubenswrapper[4909]: I1201 10:48:17.723015 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" event={"ID":"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a","Type":"ContainerStarted","Data":"f3cda89f3eaa826e93386c43ea279d9950dd52da5d500e7ab98308ba8b8379b5"} Dec 01 10:48:17 crc kubenswrapper[4909]: I1201 
10:48:17.787379 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-975kr"] Dec 01 10:48:17 crc kubenswrapper[4909]: W1201 10:48:17.805485 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc067d444_fbd3_4601_bc51_29f9d7c25afb.slice/crio-f903c21bae145312d107fc419b65777478a8ee820b6e0d0f5c20cb311755c691 WatchSource:0}: Error finding container f903c21bae145312d107fc419b65777478a8ee820b6e0d0f5c20cb311755c691: Status 404 returned error can't find the container with id f903c21bae145312d107fc419b65777478a8ee820b6e0d0f5c20cb311755c691 Dec 01 10:48:17 crc kubenswrapper[4909]: I1201 10:48:17.812030 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-xqrbt"] Dec 01 10:48:17 crc kubenswrapper[4909]: I1201 10:48:17.938014 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.200441 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.350703 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-ovsdbserver-nb\") pod \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\" (UID: \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\") " Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.350757 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-ovsdbserver-sb\") pod \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\" (UID: \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\") " Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.350789 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-dns-svc\") pod \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\" (UID: \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\") " Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.351098 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9nbh\" (UniqueName: \"kubernetes.io/projected/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-kube-api-access-n9nbh\") pod \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\" (UID: \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\") " Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.351162 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-config\") pod \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\" (UID: \"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a\") " Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.368889 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-kube-api-access-n9nbh" (OuterVolumeSpecName: "kube-api-access-n9nbh") pod "bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a" (UID: "bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a"). InnerVolumeSpecName "kube-api-access-n9nbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.386060 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a" (UID: "bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.389688 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a" (UID: "bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.399579 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a" (UID: "bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.411074 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-config" (OuterVolumeSpecName: "config") pod "bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a" (UID: "bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.454403 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.454439 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.454450 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.454460 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.454469 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9nbh\" (UniqueName: \"kubernetes.io/projected/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a-kube-api-access-n9nbh\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.789076 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" event={"ID":"bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a","Type":"ContainerDied","Data":"f3cda89f3eaa826e93386c43ea279d9950dd52da5d500e7ab98308ba8b8379b5"} Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.789635 4909 scope.go:117] "RemoveContainer" containerID="cce80bfa1b8e1aedac1f45906962b59f5d6bf3ebfaef00feb2b7c1145e14460a" Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.789172 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-x6zzh" Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.840844 4909 generic.go:334] "Generic (PLEG): container finished" podID="c067d444-fbd3-4601-bc51-29f9d7c25afb" containerID="a3461a84e16d8c7e9991ea813dec74a6ea027e7a068db3b4ff8dc5979844cf04" exitCode=0 Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.840979 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" event={"ID":"c067d444-fbd3-4601-bc51-29f9d7c25afb","Type":"ContainerDied","Data":"a3461a84e16d8c7e9991ea813dec74a6ea027e7a068db3b4ff8dc5979844cf04"} Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.841012 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" event={"ID":"c067d444-fbd3-4601-bc51-29f9d7c25afb","Type":"ContainerStarted","Data":"f903c21bae145312d107fc419b65777478a8ee820b6e0d0f5c20cb311755c691"} Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.895237 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qqfxj" event={"ID":"672cc9fe-2684-446f-8c16-59c03ee23678","Type":"ContainerStarted","Data":"f14a8d94e57329faf4f94341d2e01775e8efdcfbbb3a7060d87d92fe0b787fa7"} Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.964664 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-975kr" event={"ID":"8cd6cd1c-99b8-4639-828e-9585790b9d26","Type":"ContainerStarted","Data":"96d53d6877496df812dcd145ec478ebf26d708a59595c121f932e9eec03dfa23"} Dec 01 10:48:18 crc kubenswrapper[4909]: I1201 10:48:18.979644 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-qqfxj" podStartSLOduration=2.9796256100000003 podStartE2EDuration="2.97962561s" podCreationTimestamp="2025-12-01 10:48:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-01 10:48:18.96668221 +0000 UTC m=+1016.201153108" watchObservedRunningTime="2025-12-01 10:48:18.97962561 +0000 UTC m=+1016.214096508" Dec 01 10:48:19 crc kubenswrapper[4909]: I1201 10:48:19.053237 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-x6zzh"] Dec 01 10:48:19 crc kubenswrapper[4909]: I1201 10:48:19.075774 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-x6zzh"] Dec 01 10:48:19 crc kubenswrapper[4909]: I1201 10:48:19.269748 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a" path="/var/lib/kubelet/pods/bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a/volumes" Dec 01 10:48:19 crc kubenswrapper[4909]: I1201 10:48:19.996271 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" event={"ID":"c067d444-fbd3-4601-bc51-29f9d7c25afb","Type":"ContainerStarted","Data":"5e45189a10fba58bec69595490266926793a6f54cda8f3885692fbda81c0bfe9"} Dec 01 10:48:19 crc kubenswrapper[4909]: I1201 10:48:19.998000 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" Dec 01 10:48:20 crc kubenswrapper[4909]: I1201 10:48:20.015891 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" podStartSLOduration=4.015857148 podStartE2EDuration="4.015857148s" podCreationTimestamp="2025-12-01 10:48:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:48:20.015523297 +0000 UTC m=+1017.249994195" watchObservedRunningTime="2025-12-01 10:48:20.015857148 +0000 UTC m=+1017.250328046" Dec 01 10:48:22 crc kubenswrapper[4909]: I1201 10:48:22.025167 4909 generic.go:334] "Generic (PLEG): container finished" podID="d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d" 
containerID="c8df12725e92ea8b98a39c1775644a76f065b526f47c544c9f7ac33438dfadc4" exitCode=0 Dec 01 10:48:22 crc kubenswrapper[4909]: I1201 10:48:22.025272 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tr79n" event={"ID":"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d","Type":"ContainerDied","Data":"c8df12725e92ea8b98a39c1775644a76f065b526f47c544c9f7ac33438dfadc4"} Dec 01 10:48:25 crc kubenswrapper[4909]: I1201 10:48:25.462439 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:25 crc kubenswrapper[4909]: I1201 10:48:25.617557 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-fernet-keys\") pod \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " Dec 01 10:48:25 crc kubenswrapper[4909]: I1201 10:48:25.617653 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-config-data\") pod \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " Dec 01 10:48:25 crc kubenswrapper[4909]: I1201 10:48:25.617752 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-scripts\") pod \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " Dec 01 10:48:25 crc kubenswrapper[4909]: I1201 10:48:25.617794 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-credential-keys\") pod \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " Dec 01 10:48:25 crc 
kubenswrapper[4909]: I1201 10:48:25.617850 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw8m4\" (UniqueName: \"kubernetes.io/projected/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-kube-api-access-dw8m4\") pod \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " Dec 01 10:48:25 crc kubenswrapper[4909]: I1201 10:48:25.618019 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-combined-ca-bundle\") pod \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\" (UID: \"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d\") " Dec 01 10:48:25 crc kubenswrapper[4909]: I1201 10:48:25.627536 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d" (UID: "d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:25 crc kubenswrapper[4909]: I1201 10:48:25.627593 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-kube-api-access-dw8m4" (OuterVolumeSpecName: "kube-api-access-dw8m4") pod "d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d" (UID: "d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d"). InnerVolumeSpecName "kube-api-access-dw8m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:25 crc kubenswrapper[4909]: I1201 10:48:25.629133 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-scripts" (OuterVolumeSpecName: "scripts") pod "d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d" (UID: "d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:25 crc kubenswrapper[4909]: I1201 10:48:25.630781 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d" (UID: "d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:25 crc kubenswrapper[4909]: I1201 10:48:25.648200 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-config-data" (OuterVolumeSpecName: "config-data") pod "d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d" (UID: "d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:25 crc kubenswrapper[4909]: I1201 10:48:25.654008 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d" (UID: "d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:25 crc kubenswrapper[4909]: I1201 10:48:25.720625 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:25 crc kubenswrapper[4909]: I1201 10:48:25.720666 4909 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:25 crc kubenswrapper[4909]: I1201 10:48:25.720677 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:25 crc kubenswrapper[4909]: I1201 10:48:25.720687 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:25 crc kubenswrapper[4909]: I1201 10:48:25.720698 4909 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:25 crc kubenswrapper[4909]: I1201 10:48:25.720710 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw8m4\" (UniqueName: \"kubernetes.io/projected/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d-kube-api-access-dw8m4\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.089362 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tr79n" event={"ID":"d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d","Type":"ContainerDied","Data":"e18dbbd774b9dd93676fa7317e55d5b062f015cf7bb02191425df43c0681f6c3"} Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 
10:48:26.089429 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tr79n" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.089453 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e18dbbd774b9dd93676fa7317e55d5b062f015cf7bb02191425df43c0681f6c3" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.650524 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-tr79n"] Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.658399 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-tr79n"] Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.748920 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-9whz5"] Dec 01 10:48:26 crc kubenswrapper[4909]: E1201 10:48:26.749325 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d" containerName="keystone-bootstrap" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.749337 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d" containerName="keystone-bootstrap" Dec 01 10:48:26 crc kubenswrapper[4909]: E1201 10:48:26.749353 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a" containerName="init" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.749360 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a" containerName="init" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.749501 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfaee276-d6bb-4ac1-9c18-556e8a7f4e3a" containerName="init" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.749529 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d" 
containerName="keystone-bootstrap" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.750139 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.752453 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.754982 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fpd6f" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.755148 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.755280 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.755404 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.823847 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9whz5"] Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.850761 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-scripts\") pod \"keystone-bootstrap-9whz5\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.850853 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-fernet-keys\") pod \"keystone-bootstrap-9whz5\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:26 crc 
kubenswrapper[4909]: I1201 10:48:26.850910 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-combined-ca-bundle\") pod \"keystone-bootstrap-9whz5\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.850939 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5rst\" (UniqueName: \"kubernetes.io/projected/cbebc139-adb5-4785-8db0-283b85b33fda-kube-api-access-t5rst\") pod \"keystone-bootstrap-9whz5\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.850968 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-credential-keys\") pod \"keystone-bootstrap-9whz5\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.851158 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-config-data\") pod \"keystone-bootstrap-9whz5\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.952298 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.954997 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-combined-ca-bundle\") pod \"keystone-bootstrap-9whz5\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.955073 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5rst\" (UniqueName: \"kubernetes.io/projected/cbebc139-adb5-4785-8db0-283b85b33fda-kube-api-access-t5rst\") pod \"keystone-bootstrap-9whz5\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.955103 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-credential-keys\") pod \"keystone-bootstrap-9whz5\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.955156 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-config-data\") pod \"keystone-bootstrap-9whz5\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.955348 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-scripts\") pod \"keystone-bootstrap-9whz5\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.955608 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-fernet-keys\") pod \"keystone-bootstrap-9whz5\" 
(UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.962434 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-credential-keys\") pod \"keystone-bootstrap-9whz5\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.964233 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-scripts\") pod \"keystone-bootstrap-9whz5\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.964701 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-fernet-keys\") pod \"keystone-bootstrap-9whz5\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.966815 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-config-data\") pod \"keystone-bootstrap-9whz5\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.971901 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-combined-ca-bundle\") pod \"keystone-bootstrap-9whz5\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:26 crc kubenswrapper[4909]: I1201 10:48:26.995448 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5rst\" (UniqueName: \"kubernetes.io/projected/cbebc139-adb5-4785-8db0-283b85b33fda-kube-api-access-t5rst\") pod \"keystone-bootstrap-9whz5\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:27 crc kubenswrapper[4909]: I1201 10:48:27.059798 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"] Dec 01 10:48:27 crc kubenswrapper[4909]: I1201 10:48:27.060194 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv" podUID="77987e39-1fa6-44c9-9f33-c066775eae5a" containerName="dnsmasq-dns" containerID="cri-o://7b493675055357c98861d160bf70397a119d36ec2c499b273a3f437bab2adde7" gracePeriod=10 Dec 01 10:48:27 crc kubenswrapper[4909]: I1201 10:48:27.083957 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:27 crc kubenswrapper[4909]: I1201 10:48:27.277546 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d" path="/var/lib/kubelet/pods/d3e31dc2-2e10-4a6f-a54b-2dbfc2c2b17d/volumes" Dec 01 10:48:28 crc kubenswrapper[4909]: I1201 10:48:28.125175 4909 generic.go:334] "Generic (PLEG): container finished" podID="77987e39-1fa6-44c9-9f33-c066775eae5a" containerID="7b493675055357c98861d160bf70397a119d36ec2c499b273a3f437bab2adde7" exitCode=0 Dec 01 10:48:28 crc kubenswrapper[4909]: I1201 10:48:28.125646 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv" event={"ID":"77987e39-1fa6-44c9-9f33-c066775eae5a","Type":"ContainerDied","Data":"7b493675055357c98861d160bf70397a119d36ec2c499b273a3f437bab2adde7"} Dec 01 10:48:28 crc kubenswrapper[4909]: I1201 10:48:28.155956 4909 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv" podUID="77987e39-1fa6-44c9-9f33-c066775eae5a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: connect: connection refused" Dec 01 10:48:33 crc kubenswrapper[4909]: I1201 10:48:33.156763 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv" podUID="77987e39-1fa6-44c9-9f33-c066775eae5a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: connect: connection refused" Dec 01 10:48:37 crc kubenswrapper[4909]: E1201 10:48:37.688536 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 01 10:48:37 crc kubenswrapper[4909]: E1201 10:48:37.689251 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wh9sz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-6nt8w_openstack(00cc5cc5-c22b-4523-96c5-baa507ad0ce1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:48:37 crc kubenswrapper[4909]: E1201 10:48:37.690523 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-6nt8w" podUID="00cc5cc5-c22b-4523-96c5-baa507ad0ce1" Dec 01 10:48:38 crc kubenswrapper[4909]: E1201 10:48:38.216073 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-6nt8w" podUID="00cc5cc5-c22b-4523-96c5-baa507ad0ce1" Dec 01 10:48:38 crc kubenswrapper[4909]: E1201 10:48:38.255715 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 01 10:48:38 crc kubenswrapper[4909]: E1201 10:48:38.257218 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bs6nd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-975kr_openstack(8cd6cd1c-99b8-4639-828e-9585790b9d26): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:48:38 crc kubenswrapper[4909]: E1201 10:48:38.258733 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-975kr" 
podUID="8cd6cd1c-99b8-4639-828e-9585790b9d26" Dec 01 10:48:38 crc kubenswrapper[4909]: I1201 10:48:38.463033 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv" Dec 01 10:48:38 crc kubenswrapper[4909]: I1201 10:48:38.593815 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-ovsdbserver-sb\") pod \"77987e39-1fa6-44c9-9f33-c066775eae5a\" (UID: \"77987e39-1fa6-44c9-9f33-c066775eae5a\") " Dec 01 10:48:38 crc kubenswrapper[4909]: I1201 10:48:38.594158 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-ovsdbserver-nb\") pod \"77987e39-1fa6-44c9-9f33-c066775eae5a\" (UID: \"77987e39-1fa6-44c9-9f33-c066775eae5a\") " Dec 01 10:48:38 crc kubenswrapper[4909]: I1201 10:48:38.594235 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgkql\" (UniqueName: \"kubernetes.io/projected/77987e39-1fa6-44c9-9f33-c066775eae5a-kube-api-access-xgkql\") pod \"77987e39-1fa6-44c9-9f33-c066775eae5a\" (UID: \"77987e39-1fa6-44c9-9f33-c066775eae5a\") " Dec 01 10:48:38 crc kubenswrapper[4909]: I1201 10:48:38.594414 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-dns-svc\") pod \"77987e39-1fa6-44c9-9f33-c066775eae5a\" (UID: \"77987e39-1fa6-44c9-9f33-c066775eae5a\") " Dec 01 10:48:38 crc kubenswrapper[4909]: I1201 10:48:38.594450 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-config\") pod \"77987e39-1fa6-44c9-9f33-c066775eae5a\" (UID: \"77987e39-1fa6-44c9-9f33-c066775eae5a\") " Dec 01 
10:48:38 crc kubenswrapper[4909]: I1201 10:48:38.600259 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77987e39-1fa6-44c9-9f33-c066775eae5a-kube-api-access-xgkql" (OuterVolumeSpecName: "kube-api-access-xgkql") pod "77987e39-1fa6-44c9-9f33-c066775eae5a" (UID: "77987e39-1fa6-44c9-9f33-c066775eae5a"). InnerVolumeSpecName "kube-api-access-xgkql". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:38 crc kubenswrapper[4909]: I1201 10:48:38.641848 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "77987e39-1fa6-44c9-9f33-c066775eae5a" (UID: "77987e39-1fa6-44c9-9f33-c066775eae5a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:38 crc kubenswrapper[4909]: I1201 10:48:38.641918 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "77987e39-1fa6-44c9-9f33-c066775eae5a" (UID: "77987e39-1fa6-44c9-9f33-c066775eae5a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:38 crc kubenswrapper[4909]: I1201 10:48:38.642087 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "77987e39-1fa6-44c9-9f33-c066775eae5a" (UID: "77987e39-1fa6-44c9-9f33-c066775eae5a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:38 crc kubenswrapper[4909]: I1201 10:48:38.643330 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-config" (OuterVolumeSpecName: "config") pod "77987e39-1fa6-44c9-9f33-c066775eae5a" (UID: "77987e39-1fa6-44c9-9f33-c066775eae5a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:38 crc kubenswrapper[4909]: I1201 10:48:38.697670 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:38 crc kubenswrapper[4909]: I1201 10:48:38.697711 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:38 crc kubenswrapper[4909]: I1201 10:48:38.697724 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:38 crc kubenswrapper[4909]: I1201 10:48:38.697737 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77987e39-1fa6-44c9-9f33-c066775eae5a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:38 crc kubenswrapper[4909]: I1201 10:48:38.697749 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgkql\" (UniqueName: \"kubernetes.io/projected/77987e39-1fa6-44c9-9f33-c066775eae5a-kube-api-access-xgkql\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:38 crc kubenswrapper[4909]: I1201 10:48:38.733828 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9whz5"] Dec 01 10:48:38 crc kubenswrapper[4909]: 
W1201 10:48:38.739472 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbebc139_adb5_4785_8db0_283b85b33fda.slice/crio-3e1819eb91149c8ccd5632a6bff026d292c455c6c809a5b4efdbd7334900fc18 WatchSource:0}: Error finding container 3e1819eb91149c8ccd5632a6bff026d292c455c6c809a5b4efdbd7334900fc18: Status 404 returned error can't find the container with id 3e1819eb91149c8ccd5632a6bff026d292c455c6c809a5b4efdbd7334900fc18 Dec 01 10:48:39 crc kubenswrapper[4909]: I1201 10:48:39.224915 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv" event={"ID":"77987e39-1fa6-44c9-9f33-c066775eae5a","Type":"ContainerDied","Data":"290efa853a44a1527bcb1f03fa3c2196e07e1855ec657e9b61a426ada3af305b"} Dec 01 10:48:39 crc kubenswrapper[4909]: I1201 10:48:39.225063 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv" Dec 01 10:48:39 crc kubenswrapper[4909]: I1201 10:48:39.225650 4909 scope.go:117] "RemoveContainer" containerID="7b493675055357c98861d160bf70397a119d36ec2c499b273a3f437bab2adde7" Dec 01 10:48:39 crc kubenswrapper[4909]: I1201 10:48:39.231391 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2jwcr" event={"ID":"e36ea017-d935-4011-8590-0aaa002229de","Type":"ContainerStarted","Data":"79dd6ce6aac679be3bd5da57d44077b6af82cddf551a6466138d7285524d5ef3"} Dec 01 10:48:39 crc kubenswrapper[4909]: I1201 10:48:39.235498 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9whz5" event={"ID":"cbebc139-adb5-4785-8db0-283b85b33fda","Type":"ContainerStarted","Data":"b417a1c144ac4d4fa82c5157225f49b226a8eca42b3b220d12001f4ad0fe1e95"} Dec 01 10:48:39 crc kubenswrapper[4909]: I1201 10:48:39.235551 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9whz5" 
event={"ID":"cbebc139-adb5-4785-8db0-283b85b33fda","Type":"ContainerStarted","Data":"3e1819eb91149c8ccd5632a6bff026d292c455c6c809a5b4efdbd7334900fc18"} Dec 01 10:48:39 crc kubenswrapper[4909]: I1201 10:48:39.243355 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce831ec4-3f79-4a97-856b-00438d195fac","Type":"ContainerStarted","Data":"fb3ee38361b3d9de607d9ccd314ce580d56be4ce958e3ae912e7acffa6a6b1af"} Dec 01 10:48:39 crc kubenswrapper[4909]: I1201 10:48:39.245588 4909 generic.go:334] "Generic (PLEG): container finished" podID="672cc9fe-2684-446f-8c16-59c03ee23678" containerID="f14a8d94e57329faf4f94341d2e01775e8efdcfbbb3a7060d87d92fe0b787fa7" exitCode=0 Dec 01 10:48:39 crc kubenswrapper[4909]: I1201 10:48:39.246736 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qqfxj" event={"ID":"672cc9fe-2684-446f-8c16-59c03ee23678","Type":"ContainerDied","Data":"f14a8d94e57329faf4f94341d2e01775e8efdcfbbb3a7060d87d92fe0b787fa7"} Dec 01 10:48:39 crc kubenswrapper[4909]: I1201 10:48:39.257186 4909 scope.go:117] "RemoveContainer" containerID="9a72c4e1c1c17f9dd6950b27c0afc4b821b7d48906714405808f12e577ae636d" Dec 01 10:48:39 crc kubenswrapper[4909]: E1201 10:48:39.257249 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-975kr" podUID="8cd6cd1c-99b8-4639-828e-9585790b9d26" Dec 01 10:48:39 crc kubenswrapper[4909]: I1201 10:48:39.277968 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-2jwcr" podStartSLOduration=2.611493593 podStartE2EDuration="23.277950509s" podCreationTimestamp="2025-12-01 10:48:16 +0000 UTC" firstStartedPulling="2025-12-01 10:48:17.612155463 +0000 UTC m=+1014.846626361" lastFinishedPulling="2025-12-01 
10:48:38.278612379 +0000 UTC m=+1035.513083277" observedRunningTime="2025-12-01 10:48:39.25735445 +0000 UTC m=+1036.491825348" watchObservedRunningTime="2025-12-01 10:48:39.277950509 +0000 UTC m=+1036.512421407" Dec 01 10:48:39 crc kubenswrapper[4909]: I1201 10:48:39.306318 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"] Dec 01 10:48:39 crc kubenswrapper[4909]: I1201 10:48:39.330995 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-xgjwv"] Dec 01 10:48:39 crc kubenswrapper[4909]: I1201 10:48:39.343618 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-9whz5" podStartSLOduration=13.343589201 podStartE2EDuration="13.343589201s" podCreationTimestamp="2025-12-01 10:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:48:39.333658948 +0000 UTC m=+1036.568129856" watchObservedRunningTime="2025-12-01 10:48:39.343589201 +0000 UTC m=+1036.578060099" Dec 01 10:48:40 crc kubenswrapper[4909]: I1201 10:48:40.259464 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce831ec4-3f79-4a97-856b-00438d195fac","Type":"ContainerStarted","Data":"616a3a90342bf6925dda980d23e00f3e02ddc62d7fb6d9cf7ca907b91c8db23d"} Dec 01 10:48:40 crc kubenswrapper[4909]: I1201 10:48:40.608168 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qqfxj" Dec 01 10:48:40 crc kubenswrapper[4909]: I1201 10:48:40.759020 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/672cc9fe-2684-446f-8c16-59c03ee23678-config\") pod \"672cc9fe-2684-446f-8c16-59c03ee23678\" (UID: \"672cc9fe-2684-446f-8c16-59c03ee23678\") " Dec 01 10:48:40 crc kubenswrapper[4909]: I1201 10:48:40.760069 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkgfp\" (UniqueName: \"kubernetes.io/projected/672cc9fe-2684-446f-8c16-59c03ee23678-kube-api-access-qkgfp\") pod \"672cc9fe-2684-446f-8c16-59c03ee23678\" (UID: \"672cc9fe-2684-446f-8c16-59c03ee23678\") " Dec 01 10:48:40 crc kubenswrapper[4909]: I1201 10:48:40.760113 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/672cc9fe-2684-446f-8c16-59c03ee23678-combined-ca-bundle\") pod \"672cc9fe-2684-446f-8c16-59c03ee23678\" (UID: \"672cc9fe-2684-446f-8c16-59c03ee23678\") " Dec 01 10:48:40 crc kubenswrapper[4909]: I1201 10:48:40.764836 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/672cc9fe-2684-446f-8c16-59c03ee23678-kube-api-access-qkgfp" (OuterVolumeSpecName: "kube-api-access-qkgfp") pod "672cc9fe-2684-446f-8c16-59c03ee23678" (UID: "672cc9fe-2684-446f-8c16-59c03ee23678"). InnerVolumeSpecName "kube-api-access-qkgfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:40 crc kubenswrapper[4909]: I1201 10:48:40.783244 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/672cc9fe-2684-446f-8c16-59c03ee23678-config" (OuterVolumeSpecName: "config") pod "672cc9fe-2684-446f-8c16-59c03ee23678" (UID: "672cc9fe-2684-446f-8c16-59c03ee23678"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:40 crc kubenswrapper[4909]: I1201 10:48:40.785848 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/672cc9fe-2684-446f-8c16-59c03ee23678-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "672cc9fe-2684-446f-8c16-59c03ee23678" (UID: "672cc9fe-2684-446f-8c16-59c03ee23678"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:40 crc kubenswrapper[4909]: I1201 10:48:40.862351 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkgfp\" (UniqueName: \"kubernetes.io/projected/672cc9fe-2684-446f-8c16-59c03ee23678-kube-api-access-qkgfp\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:40 crc kubenswrapper[4909]: I1201 10:48:40.862696 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/672cc9fe-2684-446f-8c16-59c03ee23678-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:40 crc kubenswrapper[4909]: I1201 10:48:40.862777 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/672cc9fe-2684-446f-8c16-59c03ee23678-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.286561 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77987e39-1fa6-44c9-9f33-c066775eae5a" path="/var/lib/kubelet/pods/77987e39-1fa6-44c9-9f33-c066775eae5a/volumes" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.293291 4909 generic.go:334] "Generic (PLEG): container finished" podID="e36ea017-d935-4011-8590-0aaa002229de" containerID="79dd6ce6aac679be3bd5da57d44077b6af82cddf551a6466138d7285524d5ef3" exitCode=0 Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.293760 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2jwcr" 
event={"ID":"e36ea017-d935-4011-8590-0aaa002229de","Type":"ContainerDied","Data":"79dd6ce6aac679be3bd5da57d44077b6af82cddf551a6466138d7285524d5ef3"} Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.296150 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qqfxj" event={"ID":"672cc9fe-2684-446f-8c16-59c03ee23678","Type":"ContainerDied","Data":"899066b0f299cf15b8066aadffb711d54ef28b8f5592f98e9a20cfa2a775a4f3"} Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.296271 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="899066b0f299cf15b8066aadffb711d54ef28b8f5592f98e9a20cfa2a775a4f3" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.296465 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qqfxj" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.684715 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-hv278"] Dec 01 10:48:41 crc kubenswrapper[4909]: E1201 10:48:41.689654 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77987e39-1fa6-44c9-9f33-c066775eae5a" containerName="dnsmasq-dns" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.689677 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="77987e39-1fa6-44c9-9f33-c066775eae5a" containerName="dnsmasq-dns" Dec 01 10:48:41 crc kubenswrapper[4909]: E1201 10:48:41.689704 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77987e39-1fa6-44c9-9f33-c066775eae5a" containerName="init" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.689709 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="77987e39-1fa6-44c9-9f33-c066775eae5a" containerName="init" Dec 01 10:48:41 crc kubenswrapper[4909]: E1201 10:48:41.689720 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="672cc9fe-2684-446f-8c16-59c03ee23678" containerName="neutron-db-sync" Dec 01 
10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.689729 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="672cc9fe-2684-446f-8c16-59c03ee23678" containerName="neutron-db-sync" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.689926 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="77987e39-1fa6-44c9-9f33-c066775eae5a" containerName="dnsmasq-dns" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.689944 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="672cc9fe-2684-446f-8c16-59c03ee23678" containerName="neutron-db-sync" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.690954 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-hv278" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.788217 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-dns-svc\") pod \"dnsmasq-dns-7b946d459c-hv278\" (UID: \"53da20b2-7c45-49d6-9964-4b495bfea701\") " pod="openstack/dnsmasq-dns-7b946d459c-hv278" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.788820 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-config\") pod \"dnsmasq-dns-7b946d459c-hv278\" (UID: \"53da20b2-7c45-49d6-9964-4b495bfea701\") " pod="openstack/dnsmasq-dns-7b946d459c-hv278" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.788905 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-hv278\" (UID: \"53da20b2-7c45-49d6-9964-4b495bfea701\") " pod="openstack/dnsmasq-dns-7b946d459c-hv278" Dec 01 10:48:41 crc 
kubenswrapper[4909]: I1201 10:48:41.788969 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-hv278\" (UID: \"53da20b2-7c45-49d6-9964-4b495bfea701\") " pod="openstack/dnsmasq-dns-7b946d459c-hv278" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.789093 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw4hp\" (UniqueName: \"kubernetes.io/projected/53da20b2-7c45-49d6-9964-4b495bfea701-kube-api-access-gw4hp\") pod \"dnsmasq-dns-7b946d459c-hv278\" (UID: \"53da20b2-7c45-49d6-9964-4b495bfea701\") " pod="openstack/dnsmasq-dns-7b946d459c-hv278" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.804228 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-hv278"] Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.883593 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6bd476784b-fck6x"] Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.886951 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6bd476784b-fck6x" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.890564 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw4hp\" (UniqueName: \"kubernetes.io/projected/53da20b2-7c45-49d6-9964-4b495bfea701-kube-api-access-gw4hp\") pod \"dnsmasq-dns-7b946d459c-hv278\" (UID: \"53da20b2-7c45-49d6-9964-4b495bfea701\") " pod="openstack/dnsmasq-dns-7b946d459c-hv278" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.890606 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-dns-svc\") pod \"dnsmasq-dns-7b946d459c-hv278\" (UID: \"53da20b2-7c45-49d6-9964-4b495bfea701\") " pod="openstack/dnsmasq-dns-7b946d459c-hv278" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.890730 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-config\") pod \"dnsmasq-dns-7b946d459c-hv278\" (UID: \"53da20b2-7c45-49d6-9964-4b495bfea701\") " pod="openstack/dnsmasq-dns-7b946d459c-hv278" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.890851 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-hv278\" (UID: \"53da20b2-7c45-49d6-9964-4b495bfea701\") " pod="openstack/dnsmasq-dns-7b946d459c-hv278" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.890921 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-hv278\" (UID: \"53da20b2-7c45-49d6-9964-4b495bfea701\") " pod="openstack/dnsmasq-dns-7b946d459c-hv278" Dec 
01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.891972 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-config\") pod \"dnsmasq-dns-7b946d459c-hv278\" (UID: \"53da20b2-7c45-49d6-9964-4b495bfea701\") " pod="openstack/dnsmasq-dns-7b946d459c-hv278" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.891993 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-dns-svc\") pod \"dnsmasq-dns-7b946d459c-hv278\" (UID: \"53da20b2-7c45-49d6-9964-4b495bfea701\") " pod="openstack/dnsmasq-dns-7b946d459c-hv278" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.894365 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bd476784b-fck6x"] Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.894953 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.896079 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.896331 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.896444 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-46hkv" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.897991 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-hv278\" (UID: \"53da20b2-7c45-49d6-9964-4b495bfea701\") " pod="openstack/dnsmasq-dns-7b946d459c-hv278" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.898484 
4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-hv278\" (UID: \"53da20b2-7c45-49d6-9964-4b495bfea701\") " pod="openstack/dnsmasq-dns-7b946d459c-hv278" Dec 01 10:48:41 crc kubenswrapper[4909]: I1201 10:48:41.925623 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw4hp\" (UniqueName: \"kubernetes.io/projected/53da20b2-7c45-49d6-9964-4b495bfea701-kube-api-access-gw4hp\") pod \"dnsmasq-dns-7b946d459c-hv278\" (UID: \"53da20b2-7c45-49d6-9964-4b495bfea701\") " pod="openstack/dnsmasq-dns-7b946d459c-hv278" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:41.999938 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-combined-ca-bundle\") pod \"neutron-6bd476784b-fck6x\" (UID: \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\") " pod="openstack/neutron-6bd476784b-fck6x" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:41.999998 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-httpd-config\") pod \"neutron-6bd476784b-fck6x\" (UID: \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\") " pod="openstack/neutron-6bd476784b-fck6x" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.000066 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-config\") pod \"neutron-6bd476784b-fck6x\" (UID: \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\") " pod="openstack/neutron-6bd476784b-fck6x" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.000112 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjth7\" (UniqueName: \"kubernetes.io/projected/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-kube-api-access-bjth7\") pod \"neutron-6bd476784b-fck6x\" (UID: \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\") " pod="openstack/neutron-6bd476784b-fck6x" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.000137 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-ovndb-tls-certs\") pod \"neutron-6bd476784b-fck6x\" (UID: \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\") " pod="openstack/neutron-6bd476784b-fck6x" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.059960 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-hv278" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.101494 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-config\") pod \"neutron-6bd476784b-fck6x\" (UID: \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\") " pod="openstack/neutron-6bd476784b-fck6x" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.101567 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjth7\" (UniqueName: \"kubernetes.io/projected/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-kube-api-access-bjth7\") pod \"neutron-6bd476784b-fck6x\" (UID: \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\") " pod="openstack/neutron-6bd476784b-fck6x" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.101598 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-ovndb-tls-certs\") pod \"neutron-6bd476784b-fck6x\" (UID: \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\") " 
pod="openstack/neutron-6bd476784b-fck6x" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.101648 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-combined-ca-bundle\") pod \"neutron-6bd476784b-fck6x\" (UID: \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\") " pod="openstack/neutron-6bd476784b-fck6x" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.101671 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-httpd-config\") pod \"neutron-6bd476784b-fck6x\" (UID: \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\") " pod="openstack/neutron-6bd476784b-fck6x" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.108630 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-combined-ca-bundle\") pod \"neutron-6bd476784b-fck6x\" (UID: \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\") " pod="openstack/neutron-6bd476784b-fck6x" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.109625 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-ovndb-tls-certs\") pod \"neutron-6bd476784b-fck6x\" (UID: \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\") " pod="openstack/neutron-6bd476784b-fck6x" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.110318 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-httpd-config\") pod \"neutron-6bd476784b-fck6x\" (UID: \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\") " pod="openstack/neutron-6bd476784b-fck6x" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.132753 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjth7\" (UniqueName: \"kubernetes.io/projected/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-kube-api-access-bjth7\") pod \"neutron-6bd476784b-fck6x\" (UID: \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\") " pod="openstack/neutron-6bd476784b-fck6x" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.135984 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-config\") pod \"neutron-6bd476784b-fck6x\" (UID: \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\") " pod="openstack/neutron-6bd476784b-fck6x" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.276967 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bd476784b-fck6x" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.654174 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-hv278"] Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.730292 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2jwcr" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.844577 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e36ea017-d935-4011-8590-0aaa002229de-logs\") pod \"e36ea017-d935-4011-8590-0aaa002229de\" (UID: \"e36ea017-d935-4011-8590-0aaa002229de\") " Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.845323 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e36ea017-d935-4011-8590-0aaa002229de-logs" (OuterVolumeSpecName: "logs") pod "e36ea017-d935-4011-8590-0aaa002229de" (UID: "e36ea017-d935-4011-8590-0aaa002229de"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.845534 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vckjp\" (UniqueName: \"kubernetes.io/projected/e36ea017-d935-4011-8590-0aaa002229de-kube-api-access-vckjp\") pod \"e36ea017-d935-4011-8590-0aaa002229de\" (UID: \"e36ea017-d935-4011-8590-0aaa002229de\") " Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.845572 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e36ea017-d935-4011-8590-0aaa002229de-config-data\") pod \"e36ea017-d935-4011-8590-0aaa002229de\" (UID: \"e36ea017-d935-4011-8590-0aaa002229de\") " Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.846343 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e36ea017-d935-4011-8590-0aaa002229de-combined-ca-bundle\") pod \"e36ea017-d935-4011-8590-0aaa002229de\" (UID: \"e36ea017-d935-4011-8590-0aaa002229de\") " Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.846461 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e36ea017-d935-4011-8590-0aaa002229de-scripts\") pod \"e36ea017-d935-4011-8590-0aaa002229de\" (UID: \"e36ea017-d935-4011-8590-0aaa002229de\") " Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.847071 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e36ea017-d935-4011-8590-0aaa002229de-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.850054 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e36ea017-d935-4011-8590-0aaa002229de-kube-api-access-vckjp" (OuterVolumeSpecName: "kube-api-access-vckjp") pod 
"e36ea017-d935-4011-8590-0aaa002229de" (UID: "e36ea017-d935-4011-8590-0aaa002229de"). InnerVolumeSpecName "kube-api-access-vckjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.858057 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e36ea017-d935-4011-8590-0aaa002229de-scripts" (OuterVolumeSpecName: "scripts") pod "e36ea017-d935-4011-8590-0aaa002229de" (UID: "e36ea017-d935-4011-8590-0aaa002229de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.886886 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e36ea017-d935-4011-8590-0aaa002229de-config-data" (OuterVolumeSpecName: "config-data") pod "e36ea017-d935-4011-8590-0aaa002229de" (UID: "e36ea017-d935-4011-8590-0aaa002229de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.903173 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e36ea017-d935-4011-8590-0aaa002229de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e36ea017-d935-4011-8590-0aaa002229de" (UID: "e36ea017-d935-4011-8590-0aaa002229de"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.955120 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vckjp\" (UniqueName: \"kubernetes.io/projected/e36ea017-d935-4011-8590-0aaa002229de-kube-api-access-vckjp\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.955168 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e36ea017-d935-4011-8590-0aaa002229de-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.955182 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e36ea017-d935-4011-8590-0aaa002229de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:42 crc kubenswrapper[4909]: I1201 10:48:42.955191 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e36ea017-d935-4011-8590-0aaa002229de-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.040565 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bd476784b-fck6x"] Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.158128 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54f9b7b8d9-xgjwv" podUID="77987e39-1fa6-44c9-9f33-c066775eae5a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: i/o timeout" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.332372 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2jwcr" event={"ID":"e36ea017-d935-4011-8590-0aaa002229de","Type":"ContainerDied","Data":"02b30179d0e47693f55e450e87936e8bb058e3557a2164a4b2f01d86e532a705"} Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.332421 4909 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="02b30179d0e47693f55e450e87936e8bb058e3557a2164a4b2f01d86e532a705" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.332504 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2jwcr" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.337389 4909 generic.go:334] "Generic (PLEG): container finished" podID="cbebc139-adb5-4785-8db0-283b85b33fda" containerID="b417a1c144ac4d4fa82c5157225f49b226a8eca42b3b220d12001f4ad0fe1e95" exitCode=0 Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.337467 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9whz5" event={"ID":"cbebc139-adb5-4785-8db0-283b85b33fda","Type":"ContainerDied","Data":"b417a1c144ac4d4fa82c5157225f49b226a8eca42b3b220d12001f4ad0fe1e95"} Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.343497 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-hv278" event={"ID":"53da20b2-7c45-49d6-9964-4b495bfea701","Type":"ContainerStarted","Data":"1af850c4ac5c51eb6123be269d37cd8009f8028804b4729ac76cca18d1591c2a"} Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.510535 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7548d5fdbb-tbnwp"] Dec 01 10:48:43 crc kubenswrapper[4909]: E1201 10:48:43.513676 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e36ea017-d935-4011-8590-0aaa002229de" containerName="placement-db-sync" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.513721 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e36ea017-d935-4011-8590-0aaa002229de" containerName="placement-db-sync" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.514001 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e36ea017-d935-4011-8590-0aaa002229de" containerName="placement-db-sync" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 
10:48:43.516001 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.518358 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.520327 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.521055 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2gvtb" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.521238 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.525069 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7548d5fdbb-tbnwp"] Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.527674 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.673533 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sll78\" (UniqueName: \"kubernetes.io/projected/5e0dedc9-48c5-448f-9d8f-c2eb51401705-kube-api-access-sll78\") pod \"placement-7548d5fdbb-tbnwp\" (UID: \"5e0dedc9-48c5-448f-9d8f-c2eb51401705\") " pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.673832 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e0dedc9-48c5-448f-9d8f-c2eb51401705-internal-tls-certs\") pod \"placement-7548d5fdbb-tbnwp\" (UID: \"5e0dedc9-48c5-448f-9d8f-c2eb51401705\") " pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc 
kubenswrapper[4909]: I1201 10:48:43.673865 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e0dedc9-48c5-448f-9d8f-c2eb51401705-public-tls-certs\") pod \"placement-7548d5fdbb-tbnwp\" (UID: \"5e0dedc9-48c5-448f-9d8f-c2eb51401705\") " pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.673920 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e0dedc9-48c5-448f-9d8f-c2eb51401705-scripts\") pod \"placement-7548d5fdbb-tbnwp\" (UID: \"5e0dedc9-48c5-448f-9d8f-c2eb51401705\") " pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.673969 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e0dedc9-48c5-448f-9d8f-c2eb51401705-logs\") pod \"placement-7548d5fdbb-tbnwp\" (UID: \"5e0dedc9-48c5-448f-9d8f-c2eb51401705\") " pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.673996 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e0dedc9-48c5-448f-9d8f-c2eb51401705-config-data\") pod \"placement-7548d5fdbb-tbnwp\" (UID: \"5e0dedc9-48c5-448f-9d8f-c2eb51401705\") " pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.674011 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e0dedc9-48c5-448f-9d8f-c2eb51401705-combined-ca-bundle\") pod \"placement-7548d5fdbb-tbnwp\" (UID: \"5e0dedc9-48c5-448f-9d8f-c2eb51401705\") " pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc 
kubenswrapper[4909]: I1201 10:48:43.776904 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e0dedc9-48c5-448f-9d8f-c2eb51401705-scripts\") pod \"placement-7548d5fdbb-tbnwp\" (UID: \"5e0dedc9-48c5-448f-9d8f-c2eb51401705\") " pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.777005 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e0dedc9-48c5-448f-9d8f-c2eb51401705-logs\") pod \"placement-7548d5fdbb-tbnwp\" (UID: \"5e0dedc9-48c5-448f-9d8f-c2eb51401705\") " pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.777955 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e0dedc9-48c5-448f-9d8f-c2eb51401705-logs\") pod \"placement-7548d5fdbb-tbnwp\" (UID: \"5e0dedc9-48c5-448f-9d8f-c2eb51401705\") " pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.778008 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e0dedc9-48c5-448f-9d8f-c2eb51401705-config-data\") pod \"placement-7548d5fdbb-tbnwp\" (UID: \"5e0dedc9-48c5-448f-9d8f-c2eb51401705\") " pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.778031 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e0dedc9-48c5-448f-9d8f-c2eb51401705-combined-ca-bundle\") pod \"placement-7548d5fdbb-tbnwp\" (UID: \"5e0dedc9-48c5-448f-9d8f-c2eb51401705\") " pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.778094 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sll78\" (UniqueName: \"kubernetes.io/projected/5e0dedc9-48c5-448f-9d8f-c2eb51401705-kube-api-access-sll78\") pod \"placement-7548d5fdbb-tbnwp\" (UID: \"5e0dedc9-48c5-448f-9d8f-c2eb51401705\") " pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.778228 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e0dedc9-48c5-448f-9d8f-c2eb51401705-internal-tls-certs\") pod \"placement-7548d5fdbb-tbnwp\" (UID: \"5e0dedc9-48c5-448f-9d8f-c2eb51401705\") " pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.779009 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e0dedc9-48c5-448f-9d8f-c2eb51401705-public-tls-certs\") pod \"placement-7548d5fdbb-tbnwp\" (UID: \"5e0dedc9-48c5-448f-9d8f-c2eb51401705\") " pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.782274 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e0dedc9-48c5-448f-9d8f-c2eb51401705-scripts\") pod \"placement-7548d5fdbb-tbnwp\" (UID: \"5e0dedc9-48c5-448f-9d8f-c2eb51401705\") " pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.784204 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e0dedc9-48c5-448f-9d8f-c2eb51401705-combined-ca-bundle\") pod \"placement-7548d5fdbb-tbnwp\" (UID: \"5e0dedc9-48c5-448f-9d8f-c2eb51401705\") " pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.784299 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5e0dedc9-48c5-448f-9d8f-c2eb51401705-public-tls-certs\") pod \"placement-7548d5fdbb-tbnwp\" (UID: \"5e0dedc9-48c5-448f-9d8f-c2eb51401705\") " pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.784321 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e0dedc9-48c5-448f-9d8f-c2eb51401705-config-data\") pod \"placement-7548d5fdbb-tbnwp\" (UID: \"5e0dedc9-48c5-448f-9d8f-c2eb51401705\") " pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.793408 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e0dedc9-48c5-448f-9d8f-c2eb51401705-internal-tls-certs\") pod \"placement-7548d5fdbb-tbnwp\" (UID: \"5e0dedc9-48c5-448f-9d8f-c2eb51401705\") " pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.797112 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sll78\" (UniqueName: \"kubernetes.io/projected/5e0dedc9-48c5-448f-9d8f-c2eb51401705-kube-api-access-sll78\") pod \"placement-7548d5fdbb-tbnwp\" (UID: \"5e0dedc9-48c5-448f-9d8f-c2eb51401705\") " pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:43 crc kubenswrapper[4909]: I1201 10:48:43.858751 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.598563 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-68f697f85-549sp"] Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.601438 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.604043 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.605216 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.615203 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-68f697f85-549sp"] Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.699617 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/65cec39f-6062-43a3-bf73-d7091a16e0a0-ovndb-tls-certs\") pod \"neutron-68f697f85-549sp\" (UID: \"65cec39f-6062-43a3-bf73-d7091a16e0a0\") " pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.699668 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/65cec39f-6062-43a3-bf73-d7091a16e0a0-config\") pod \"neutron-68f697f85-549sp\" (UID: \"65cec39f-6062-43a3-bf73-d7091a16e0a0\") " pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.699711 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65cec39f-6062-43a3-bf73-d7091a16e0a0-internal-tls-certs\") pod \"neutron-68f697f85-549sp\" (UID: \"65cec39f-6062-43a3-bf73-d7091a16e0a0\") " pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.699908 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/65cec39f-6062-43a3-bf73-d7091a16e0a0-combined-ca-bundle\") pod \"neutron-68f697f85-549sp\" (UID: \"65cec39f-6062-43a3-bf73-d7091a16e0a0\") " pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.700016 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/65cec39f-6062-43a3-bf73-d7091a16e0a0-httpd-config\") pod \"neutron-68f697f85-549sp\" (UID: \"65cec39f-6062-43a3-bf73-d7091a16e0a0\") " pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.700396 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65cec39f-6062-43a3-bf73-d7091a16e0a0-public-tls-certs\") pod \"neutron-68f697f85-549sp\" (UID: \"65cec39f-6062-43a3-bf73-d7091a16e0a0\") " pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.700435 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb5dp\" (UniqueName: \"kubernetes.io/projected/65cec39f-6062-43a3-bf73-d7091a16e0a0-kube-api-access-hb5dp\") pod \"neutron-68f697f85-549sp\" (UID: \"65cec39f-6062-43a3-bf73-d7091a16e0a0\") " pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.802712 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65cec39f-6062-43a3-bf73-d7091a16e0a0-internal-tls-certs\") pod \"neutron-68f697f85-549sp\" (UID: \"65cec39f-6062-43a3-bf73-d7091a16e0a0\") " pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.802805 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/65cec39f-6062-43a3-bf73-d7091a16e0a0-combined-ca-bundle\") pod \"neutron-68f697f85-549sp\" (UID: \"65cec39f-6062-43a3-bf73-d7091a16e0a0\") " pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.802853 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/65cec39f-6062-43a3-bf73-d7091a16e0a0-httpd-config\") pod \"neutron-68f697f85-549sp\" (UID: \"65cec39f-6062-43a3-bf73-d7091a16e0a0\") " pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.803082 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65cec39f-6062-43a3-bf73-d7091a16e0a0-public-tls-certs\") pod \"neutron-68f697f85-549sp\" (UID: \"65cec39f-6062-43a3-bf73-d7091a16e0a0\") " pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.803107 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb5dp\" (UniqueName: \"kubernetes.io/projected/65cec39f-6062-43a3-bf73-d7091a16e0a0-kube-api-access-hb5dp\") pod \"neutron-68f697f85-549sp\" (UID: \"65cec39f-6062-43a3-bf73-d7091a16e0a0\") " pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.803217 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/65cec39f-6062-43a3-bf73-d7091a16e0a0-ovndb-tls-certs\") pod \"neutron-68f697f85-549sp\" (UID: \"65cec39f-6062-43a3-bf73-d7091a16e0a0\") " pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.803246 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/65cec39f-6062-43a3-bf73-d7091a16e0a0-config\") pod \"neutron-68f697f85-549sp\" 
(UID: \"65cec39f-6062-43a3-bf73-d7091a16e0a0\") " pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.809683 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/65cec39f-6062-43a3-bf73-d7091a16e0a0-ovndb-tls-certs\") pod \"neutron-68f697f85-549sp\" (UID: \"65cec39f-6062-43a3-bf73-d7091a16e0a0\") " pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.810682 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65cec39f-6062-43a3-bf73-d7091a16e0a0-combined-ca-bundle\") pod \"neutron-68f697f85-549sp\" (UID: \"65cec39f-6062-43a3-bf73-d7091a16e0a0\") " pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.811207 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65cec39f-6062-43a3-bf73-d7091a16e0a0-internal-tls-certs\") pod \"neutron-68f697f85-549sp\" (UID: \"65cec39f-6062-43a3-bf73-d7091a16e0a0\") " pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.811328 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65cec39f-6062-43a3-bf73-d7091a16e0a0-public-tls-certs\") pod \"neutron-68f697f85-549sp\" (UID: \"65cec39f-6062-43a3-bf73-d7091a16e0a0\") " pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.819798 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/65cec39f-6062-43a3-bf73-d7091a16e0a0-httpd-config\") pod \"neutron-68f697f85-549sp\" (UID: \"65cec39f-6062-43a3-bf73-d7091a16e0a0\") " pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 
10:48:44.822937 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/65cec39f-6062-43a3-bf73-d7091a16e0a0-config\") pod \"neutron-68f697f85-549sp\" (UID: \"65cec39f-6062-43a3-bf73-d7091a16e0a0\") " pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.827038 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb5dp\" (UniqueName: \"kubernetes.io/projected/65cec39f-6062-43a3-bf73-d7091a16e0a0-kube-api-access-hb5dp\") pod \"neutron-68f697f85-549sp\" (UID: \"65cec39f-6062-43a3-bf73-d7091a16e0a0\") " pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:44 crc kubenswrapper[4909]: I1201 10:48:44.935781 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:46 crc kubenswrapper[4909]: I1201 10:48:46.821755 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:46 crc kubenswrapper[4909]: I1201 10:48:46.944676 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-credential-keys\") pod \"cbebc139-adb5-4785-8db0-283b85b33fda\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " Dec 01 10:48:46 crc kubenswrapper[4909]: I1201 10:48:46.946565 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-config-data\") pod \"cbebc139-adb5-4785-8db0-283b85b33fda\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " Dec 01 10:48:46 crc kubenswrapper[4909]: I1201 10:48:46.947540 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-fernet-keys\") pod \"cbebc139-adb5-4785-8db0-283b85b33fda\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " Dec 01 10:48:46 crc kubenswrapper[4909]: I1201 10:48:46.948245 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-combined-ca-bundle\") pod \"cbebc139-adb5-4785-8db0-283b85b33fda\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " Dec 01 10:48:46 crc kubenswrapper[4909]: I1201 10:48:46.948301 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5rst\" (UniqueName: \"kubernetes.io/projected/cbebc139-adb5-4785-8db0-283b85b33fda-kube-api-access-t5rst\") pod \"cbebc139-adb5-4785-8db0-283b85b33fda\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " Dec 01 10:48:46 crc kubenswrapper[4909]: I1201 10:48:46.948338 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-scripts\") pod \"cbebc139-adb5-4785-8db0-283b85b33fda\" (UID: \"cbebc139-adb5-4785-8db0-283b85b33fda\") " Dec 01 10:48:46 crc kubenswrapper[4909]: I1201 10:48:46.956187 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cbebc139-adb5-4785-8db0-283b85b33fda" (UID: "cbebc139-adb5-4785-8db0-283b85b33fda"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:46 crc kubenswrapper[4909]: I1201 10:48:46.957500 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-scripts" (OuterVolumeSpecName: "scripts") pod "cbebc139-adb5-4785-8db0-283b85b33fda" (UID: "cbebc139-adb5-4785-8db0-283b85b33fda"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:46 crc kubenswrapper[4909]: I1201 10:48:46.958208 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbebc139-adb5-4785-8db0-283b85b33fda-kube-api-access-t5rst" (OuterVolumeSpecName: "kube-api-access-t5rst") pod "cbebc139-adb5-4785-8db0-283b85b33fda" (UID: "cbebc139-adb5-4785-8db0-283b85b33fda"). InnerVolumeSpecName "kube-api-access-t5rst". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:46 crc kubenswrapper[4909]: I1201 10:48:46.962024 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cbebc139-adb5-4785-8db0-283b85b33fda" (UID: "cbebc139-adb5-4785-8db0-283b85b33fda"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.000001 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbebc139-adb5-4785-8db0-283b85b33fda" (UID: "cbebc139-adb5-4785-8db0-283b85b33fda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.000172 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-config-data" (OuterVolumeSpecName: "config-data") pod "cbebc139-adb5-4785-8db0-283b85b33fda" (UID: "cbebc139-adb5-4785-8db0-283b85b33fda"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.005458 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7548d5fdbb-tbnwp"] Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.051202 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5rst\" (UniqueName: \"kubernetes.io/projected/cbebc139-adb5-4785-8db0-283b85b33fda-kube-api-access-t5rst\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.051256 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.051272 4909 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.051283 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.051294 4909 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.051305 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbebc139-adb5-4785-8db0-283b85b33fda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:47 crc kubenswrapper[4909]: W1201 10:48:47.058037 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e0dedc9_48c5_448f_9d8f_c2eb51401705.slice/crio-10653f7a984c05ae15be3422ce73bcb4420edaa70fa521b42b899cf2938e24cf WatchSource:0}: Error finding container 10653f7a984c05ae15be3422ce73bcb4420edaa70fa521b42b899cf2938e24cf: Status 404 returned error can't find the container with id 10653f7a984c05ae15be3422ce73bcb4420edaa70fa521b42b899cf2938e24cf Dec 01 10:48:47 crc kubenswrapper[4909]: W1201 10:48:47.205414 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65cec39f_6062_43a3_bf73_d7091a16e0a0.slice/crio-18126c25c0a0be13afb8a371e9b58d25b09e271d522a370a14392e563d5da58e WatchSource:0}: Error finding container 18126c25c0a0be13afb8a371e9b58d25b09e271d522a370a14392e563d5da58e: Status 404 returned error can't find the container with id 18126c25c0a0be13afb8a371e9b58d25b09e271d522a370a14392e563d5da58e Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.208503 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-68f697f85-549sp"] Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.390238 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68f697f85-549sp" event={"ID":"65cec39f-6062-43a3-bf73-d7091a16e0a0","Type":"ContainerStarted","Data":"18126c25c0a0be13afb8a371e9b58d25b09e271d522a370a14392e563d5da58e"} Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.391611 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7548d5fdbb-tbnwp" event={"ID":"5e0dedc9-48c5-448f-9d8f-c2eb51401705","Type":"ContainerStarted","Data":"0ed875a0452b123ff61fb714ffc99f3a11a3bd84325299e714bed858bd1ddb32"} Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.391643 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7548d5fdbb-tbnwp" 
event={"ID":"5e0dedc9-48c5-448f-9d8f-c2eb51401705","Type":"ContainerStarted","Data":"10653f7a984c05ae15be3422ce73bcb4420edaa70fa521b42b899cf2938e24cf"} Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.394030 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9whz5" event={"ID":"cbebc139-adb5-4785-8db0-283b85b33fda","Type":"ContainerDied","Data":"3e1819eb91149c8ccd5632a6bff026d292c455c6c809a5b4efdbd7334900fc18"} Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.394055 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e1819eb91149c8ccd5632a6bff026d292c455c6c809a5b4efdbd7334900fc18" Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.394130 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9whz5" Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.397997 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd476784b-fck6x" event={"ID":"449c0d76-7d5d-4817-8f78-074ae4cb1cfa","Type":"ContainerStarted","Data":"400c05162b9957e073998220e0a3387fb1f55f81e54c5e184da6075d48e2b8d2"} Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.398040 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd476784b-fck6x" event={"ID":"449c0d76-7d5d-4817-8f78-074ae4cb1cfa","Type":"ContainerStarted","Data":"219c64bf1be97806f47c2029fd3211b3edd3a7b777d1e2c3e365744010130179"} Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.398052 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd476784b-fck6x" event={"ID":"449c0d76-7d5d-4817-8f78-074ae4cb1cfa","Type":"ContainerStarted","Data":"07f45b2c884753c4116383e26e5f00ac677261bb1df3fb775ebe1e32c84dd98c"} Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.399164 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6bd476784b-fck6x" Dec 01 10:48:47 crc 
kubenswrapper[4909]: I1201 10:48:47.409247 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce831ec4-3f79-4a97-856b-00438d195fac","Type":"ContainerStarted","Data":"39b37a78ac9c0cc981fcb4959e819bd8bca5a88b4565b3af90762f8cecdf2e96"} Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.412584 4909 generic.go:334] "Generic (PLEG): container finished" podID="53da20b2-7c45-49d6-9964-4b495bfea701" containerID="654d40d814da071fafd37efd00e69610a2539ba7a14ee90a7615a5f5c55d3eb9" exitCode=0 Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.412630 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-hv278" event={"ID":"53da20b2-7c45-49d6-9964-4b495bfea701","Type":"ContainerDied","Data":"654d40d814da071fafd37efd00e69610a2539ba7a14ee90a7615a5f5c55d3eb9"} Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.432005 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6bd476784b-fck6x" podStartSLOduration=6.431988168 podStartE2EDuration="6.431988168s" podCreationTimestamp="2025-12-01 10:48:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:48:47.427965353 +0000 UTC m=+1044.662436271" watchObservedRunningTime="2025-12-01 10:48:47.431988168 +0000 UTC m=+1044.666459066" Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.941845 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-58dc5cfbbd-v7pkq"] Dec 01 10:48:47 crc kubenswrapper[4909]: E1201 10:48:47.942836 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbebc139-adb5-4785-8db0-283b85b33fda" containerName="keystone-bootstrap" Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.942857 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbebc139-adb5-4785-8db0-283b85b33fda" containerName="keystone-bootstrap" Dec 01 10:48:47 crc 
kubenswrapper[4909]: I1201 10:48:47.943151 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbebc139-adb5-4785-8db0-283b85b33fda" containerName="keystone-bootstrap" Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.943952 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.959615 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.959891 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.960108 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.960383 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.960569 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.960687 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fpd6f" Dec 01 10:48:47 crc kubenswrapper[4909]: I1201 10:48:47.964368 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-58dc5cfbbd-v7pkq"] Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.093011 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njxj8\" (UniqueName: \"kubernetes.io/projected/c6bb8d66-286d-4137-ab50-a69fec49ab3c-kube-api-access-njxj8\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 
10:48:48.093079 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6bb8d66-286d-4137-ab50-a69fec49ab3c-scripts\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.093112 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c6bb8d66-286d-4137-ab50-a69fec49ab3c-credential-keys\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.093140 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6bb8d66-286d-4137-ab50-a69fec49ab3c-internal-tls-certs\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.093166 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bb8d66-286d-4137-ab50-a69fec49ab3c-combined-ca-bundle\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.093198 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6bb8d66-286d-4137-ab50-a69fec49ab3c-config-data\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 
10:48:48.093212 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6bb8d66-286d-4137-ab50-a69fec49ab3c-public-tls-certs\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.093263 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c6bb8d66-286d-4137-ab50-a69fec49ab3c-fernet-keys\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.194935 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6bb8d66-286d-4137-ab50-a69fec49ab3c-scripts\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.194997 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c6bb8d66-286d-4137-ab50-a69fec49ab3c-credential-keys\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.195031 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6bb8d66-286d-4137-ab50-a69fec49ab3c-internal-tls-certs\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.195061 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bb8d66-286d-4137-ab50-a69fec49ab3c-combined-ca-bundle\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.195094 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6bb8d66-286d-4137-ab50-a69fec49ab3c-config-data\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.195110 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6bb8d66-286d-4137-ab50-a69fec49ab3c-public-tls-certs\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.195166 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c6bb8d66-286d-4137-ab50-a69fec49ab3c-fernet-keys\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.195209 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njxj8\" (UniqueName: \"kubernetes.io/projected/c6bb8d66-286d-4137-ab50-a69fec49ab3c-kube-api-access-njxj8\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.206696 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c6bb8d66-286d-4137-ab50-a69fec49ab3c-public-tls-certs\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.206728 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c6bb8d66-286d-4137-ab50-a69fec49ab3c-fernet-keys\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.207050 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bb8d66-286d-4137-ab50-a69fec49ab3c-combined-ca-bundle\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.207565 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6bb8d66-286d-4137-ab50-a69fec49ab3c-internal-tls-certs\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.207695 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c6bb8d66-286d-4137-ab50-a69fec49ab3c-credential-keys\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.208154 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6bb8d66-286d-4137-ab50-a69fec49ab3c-config-data\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: 
\"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.213823 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6bb8d66-286d-4137-ab50-a69fec49ab3c-scripts\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.215446 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njxj8\" (UniqueName: \"kubernetes.io/projected/c6bb8d66-286d-4137-ab50-a69fec49ab3c-kube-api-access-njxj8\") pod \"keystone-58dc5cfbbd-v7pkq\" (UID: \"c6bb8d66-286d-4137-ab50-a69fec49ab3c\") " pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.280158 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.429485 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68f697f85-549sp" event={"ID":"65cec39f-6062-43a3-bf73-d7091a16e0a0","Type":"ContainerStarted","Data":"24892e37455408bee07446db1e1e7da7453023d8661af057136c07d39bf1014d"} Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.429566 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68f697f85-549sp" event={"ID":"65cec39f-6062-43a3-bf73-d7091a16e0a0","Type":"ContainerStarted","Data":"91509081c924349a2984d4d4fdc2e40ca257b4a02fd4b13a010d24d52bb3b0ef"} Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.429617 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-68f697f85-549sp" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.466232 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7548d5fdbb-tbnwp" 
event={"ID":"5e0dedc9-48c5-448f-9d8f-c2eb51401705","Type":"ContainerStarted","Data":"c2bb10dd4d9ff55b024a9dd63067d12e0ad1b4cc16d604f82cd91032ed413a39"} Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.467527 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.467556 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.473028 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-68f697f85-549sp" podStartSLOduration=4.47300101 podStartE2EDuration="4.47300101s" podCreationTimestamp="2025-12-01 10:48:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:48:48.456569248 +0000 UTC m=+1045.691040146" watchObservedRunningTime="2025-12-01 10:48:48.47300101 +0000 UTC m=+1045.707471908" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.489945 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-hv278" event={"ID":"53da20b2-7c45-49d6-9964-4b495bfea701","Type":"ContainerStarted","Data":"71f492ea1b84428a90c559b3f468226616e84252068deda39c98d1c21e8972cb"} Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.490349 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b946d459c-hv278" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.498063 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7548d5fdbb-tbnwp" podStartSLOduration=5.498039251 podStartE2EDuration="5.498039251s" podCreationTimestamp="2025-12-01 10:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 
10:48:48.492497288 +0000 UTC m=+1045.726968206" watchObservedRunningTime="2025-12-01 10:48:48.498039251 +0000 UTC m=+1045.732510149" Dec 01 10:48:48 crc kubenswrapper[4909]: I1201 10:48:48.533586 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b946d459c-hv278" podStartSLOduration=7.533558798 podStartE2EDuration="7.533558798s" podCreationTimestamp="2025-12-01 10:48:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:48:48.525827296 +0000 UTC m=+1045.760298194" watchObservedRunningTime="2025-12-01 10:48:48.533558798 +0000 UTC m=+1045.768029696" Dec 01 10:48:52 crc kubenswrapper[4909]: I1201 10:48:48.896216 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-58dc5cfbbd-v7pkq"] Dec 01 10:48:52 crc kubenswrapper[4909]: I1201 10:48:49.507170 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-58dc5cfbbd-v7pkq" event={"ID":"c6bb8d66-286d-4137-ab50-a69fec49ab3c","Type":"ContainerStarted","Data":"39eacb336ac0f2028c19a3080d2b56a97d33a8fa841ffdb2cc2d3c31dd958664"} Dec 01 10:48:52 crc kubenswrapper[4909]: I1201 10:48:50.515472 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-58dc5cfbbd-v7pkq" event={"ID":"c6bb8d66-286d-4137-ab50-a69fec49ab3c","Type":"ContainerStarted","Data":"87426921ba1cb6fc028d8e556e2d75731583778527e528108de7bd2cb4b700af"} Dec 01 10:48:52 crc kubenswrapper[4909]: I1201 10:48:50.544366 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-58dc5cfbbd-v7pkq" podStartSLOduration=3.544346093 podStartE2EDuration="3.544346093s" podCreationTimestamp="2025-12-01 10:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:48:50.534411573 +0000 UTC m=+1047.768882501" 
watchObservedRunningTime="2025-12-01 10:48:50.544346093 +0000 UTC m=+1047.778817001" Dec 01 10:48:52 crc kubenswrapper[4909]: I1201 10:48:51.523356 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:48:52 crc kubenswrapper[4909]: I1201 10:48:52.064031 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b946d459c-hv278" Dec 01 10:48:52 crc kubenswrapper[4909]: I1201 10:48:52.133482 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-xqrbt"] Dec 01 10:48:52 crc kubenswrapper[4909]: I1201 10:48:52.133771 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" podUID="c067d444-fbd3-4601-bc51-29f9d7c25afb" containerName="dnsmasq-dns" containerID="cri-o://5e45189a10fba58bec69595490266926793a6f54cda8f3885692fbda81c0bfe9" gracePeriod=10 Dec 01 10:48:52 crc kubenswrapper[4909]: I1201 10:48:52.539328 4909 generic.go:334] "Generic (PLEG): container finished" podID="c067d444-fbd3-4601-bc51-29f9d7c25afb" containerID="5e45189a10fba58bec69595490266926793a6f54cda8f3885692fbda81c0bfe9" exitCode=0 Dec 01 10:48:52 crc kubenswrapper[4909]: I1201 10:48:52.539531 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" event={"ID":"c067d444-fbd3-4601-bc51-29f9d7c25afb","Type":"ContainerDied","Data":"5e45189a10fba58bec69595490266926793a6f54cda8f3885692fbda81c0bfe9"} Dec 01 10:48:56 crc kubenswrapper[4909]: I1201 10:48:56.750224 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" Dec 01 10:48:56 crc kubenswrapper[4909]: I1201 10:48:56.864176 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-ovsdbserver-nb\") pod \"c067d444-fbd3-4601-bc51-29f9d7c25afb\" (UID: \"c067d444-fbd3-4601-bc51-29f9d7c25afb\") " Dec 01 10:48:56 crc kubenswrapper[4909]: I1201 10:48:56.864711 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-ovsdbserver-sb\") pod \"c067d444-fbd3-4601-bc51-29f9d7c25afb\" (UID: \"c067d444-fbd3-4601-bc51-29f9d7c25afb\") " Dec 01 10:48:56 crc kubenswrapper[4909]: I1201 10:48:56.864817 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46knm\" (UniqueName: \"kubernetes.io/projected/c067d444-fbd3-4601-bc51-29f9d7c25afb-kube-api-access-46knm\") pod \"c067d444-fbd3-4601-bc51-29f9d7c25afb\" (UID: \"c067d444-fbd3-4601-bc51-29f9d7c25afb\") " Dec 01 10:48:56 crc kubenswrapper[4909]: I1201 10:48:56.864937 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-config\") pod \"c067d444-fbd3-4601-bc51-29f9d7c25afb\" (UID: \"c067d444-fbd3-4601-bc51-29f9d7c25afb\") " Dec 01 10:48:56 crc kubenswrapper[4909]: I1201 10:48:56.865019 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-dns-svc\") pod \"c067d444-fbd3-4601-bc51-29f9d7c25afb\" (UID: \"c067d444-fbd3-4601-bc51-29f9d7c25afb\") " Dec 01 10:48:56 crc kubenswrapper[4909]: I1201 10:48:56.874086 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c067d444-fbd3-4601-bc51-29f9d7c25afb-kube-api-access-46knm" (OuterVolumeSpecName: "kube-api-access-46knm") pod "c067d444-fbd3-4601-bc51-29f9d7c25afb" (UID: "c067d444-fbd3-4601-bc51-29f9d7c25afb"). InnerVolumeSpecName "kube-api-access-46knm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:56 crc kubenswrapper[4909]: I1201 10:48:56.911354 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-config" (OuterVolumeSpecName: "config") pod "c067d444-fbd3-4601-bc51-29f9d7c25afb" (UID: "c067d444-fbd3-4601-bc51-29f9d7c25afb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:56 crc kubenswrapper[4909]: I1201 10:48:56.912135 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c067d444-fbd3-4601-bc51-29f9d7c25afb" (UID: "c067d444-fbd3-4601-bc51-29f9d7c25afb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:56 crc kubenswrapper[4909]: I1201 10:48:56.913087 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c067d444-fbd3-4601-bc51-29f9d7c25afb" (UID: "c067d444-fbd3-4601-bc51-29f9d7c25afb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:56 crc kubenswrapper[4909]: I1201 10:48:56.922938 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c067d444-fbd3-4601-bc51-29f9d7c25afb" (UID: "c067d444-fbd3-4601-bc51-29f9d7c25afb"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:48:56 crc kubenswrapper[4909]: I1201 10:48:56.968285 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:56 crc kubenswrapper[4909]: I1201 10:48:56.968340 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:56 crc kubenswrapper[4909]: I1201 10:48:56.968354 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:56 crc kubenswrapper[4909]: I1201 10:48:56.968368 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46knm\" (UniqueName: \"kubernetes.io/projected/c067d444-fbd3-4601-bc51-29f9d7c25afb-kube-api-access-46knm\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:56 crc kubenswrapper[4909]: I1201 10:48:56.968378 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c067d444-fbd3-4601-bc51-29f9d7c25afb-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:57 crc kubenswrapper[4909]: I1201 10:48:57.588103 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" event={"ID":"c067d444-fbd3-4601-bc51-29f9d7c25afb","Type":"ContainerDied","Data":"f903c21bae145312d107fc419b65777478a8ee820b6e0d0f5c20cb311755c691"} Dec 01 10:48:57 crc kubenswrapper[4909]: I1201 10:48:57.588420 4909 scope.go:117] "RemoveContainer" containerID="5e45189a10fba58bec69595490266926793a6f54cda8f3885692fbda81c0bfe9" Dec 01 10:48:57 crc kubenswrapper[4909]: I1201 10:48:57.588548 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-xqrbt" Dec 01 10:48:57 crc kubenswrapper[4909]: I1201 10:48:57.594044 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce831ec4-3f79-4a97-856b-00438d195fac","Type":"ContainerStarted","Data":"412fafababbcc18263156c61b90ad1dec5e97037eb1c2eba8af0fd071718b077"} Dec 01 10:48:57 crc kubenswrapper[4909]: I1201 10:48:57.594253 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ce831ec4-3f79-4a97-856b-00438d195fac" containerName="ceilometer-central-agent" containerID="cri-o://fb3ee38361b3d9de607d9ccd314ce580d56be4ce958e3ae912e7acffa6a6b1af" gracePeriod=30 Dec 01 10:48:57 crc kubenswrapper[4909]: I1201 10:48:57.594534 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 10:48:57 crc kubenswrapper[4909]: I1201 10:48:57.594599 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ce831ec4-3f79-4a97-856b-00438d195fac" containerName="proxy-httpd" containerID="cri-o://412fafababbcc18263156c61b90ad1dec5e97037eb1c2eba8af0fd071718b077" gracePeriod=30 Dec 01 10:48:57 crc kubenswrapper[4909]: I1201 10:48:57.594662 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ce831ec4-3f79-4a97-856b-00438d195fac" containerName="sg-core" containerID="cri-o://39b37a78ac9c0cc981fcb4959e819bd8bca5a88b4565b3af90762f8cecdf2e96" gracePeriod=30 Dec 01 10:48:57 crc kubenswrapper[4909]: I1201 10:48:57.594735 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ce831ec4-3f79-4a97-856b-00438d195fac" containerName="ceilometer-notification-agent" containerID="cri-o://616a3a90342bf6925dda980d23e00f3e02ddc62d7fb6d9cf7ca907b91c8db23d" gracePeriod=30 Dec 01 10:48:57 crc kubenswrapper[4909]: I1201 10:48:57.603186 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6nt8w" event={"ID":"00cc5cc5-c22b-4523-96c5-baa507ad0ce1","Type":"ContainerStarted","Data":"0eb9e3c4665fd1d6854ac6b37fbe4826ac4563d9e160e20e11d8b7c643d33471"} Dec 01 10:48:57 crc kubenswrapper[4909]: I1201 10:48:57.611707 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-975kr" event={"ID":"8cd6cd1c-99b8-4639-828e-9585790b9d26","Type":"ContainerStarted","Data":"8711a9b6a33a1ddee459a43a67208d9e582e2890e46890cd4d3ca3e677d3e22a"} Dec 01 10:48:57 crc kubenswrapper[4909]: I1201 10:48:57.619234 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-xqrbt"] Dec 01 10:48:57 crc kubenswrapper[4909]: I1201 10:48:57.624841 4909 scope.go:117] "RemoveContainer" containerID="a3461a84e16d8c7e9991ea813dec74a6ea027e7a068db3b4ff8dc5979844cf04" Dec 01 10:48:57 crc kubenswrapper[4909]: I1201 10:48:57.628823 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-xqrbt"] Dec 01 10:48:57 crc kubenswrapper[4909]: I1201 10:48:57.644243 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.391105423 podStartE2EDuration="41.644214806s" podCreationTimestamp="2025-12-01 10:48:16 +0000 UTC" firstStartedPulling="2025-12-01 10:48:17.281695259 +0000 UTC m=+1014.516166157" lastFinishedPulling="2025-12-01 10:48:56.534804642 +0000 UTC m=+1053.769275540" observedRunningTime="2025-12-01 10:48:57.639088896 +0000 UTC m=+1054.873559804" watchObservedRunningTime="2025-12-01 10:48:57.644214806 +0000 UTC m=+1054.878685704" Dec 01 10:48:57 crc kubenswrapper[4909]: I1201 10:48:57.664976 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-975kr" podStartSLOduration=2.9674979500000003 podStartE2EDuration="41.664953143s" podCreationTimestamp="2025-12-01 10:48:16 +0000 UTC" 
firstStartedPulling="2025-12-01 10:48:17.815592781 +0000 UTC m=+1015.050063679" lastFinishedPulling="2025-12-01 10:48:56.513047964 +0000 UTC m=+1053.747518872" observedRunningTime="2025-12-01 10:48:57.657863651 +0000 UTC m=+1054.892334569" watchObservedRunningTime="2025-12-01 10:48:57.664953143 +0000 UTC m=+1054.899424051" Dec 01 10:48:57 crc kubenswrapper[4909]: I1201 10:48:57.677076 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-6nt8w" podStartSLOduration=2.429919089 podStartE2EDuration="41.67705648s" podCreationTimestamp="2025-12-01 10:48:16 +0000 UTC" firstStartedPulling="2025-12-01 10:48:17.271116716 +0000 UTC m=+1014.505587614" lastFinishedPulling="2025-12-01 10:48:56.518254117 +0000 UTC m=+1053.752725005" observedRunningTime="2025-12-01 10:48:57.673321553 +0000 UTC m=+1054.907792471" watchObservedRunningTime="2025-12-01 10:48:57.67705648 +0000 UTC m=+1054.911527378" Dec 01 10:48:58 crc kubenswrapper[4909]: I1201 10:48:58.624431 4909 generic.go:334] "Generic (PLEG): container finished" podID="ce831ec4-3f79-4a97-856b-00438d195fac" containerID="412fafababbcc18263156c61b90ad1dec5e97037eb1c2eba8af0fd071718b077" exitCode=0 Dec 01 10:48:58 crc kubenswrapper[4909]: I1201 10:48:58.624921 4909 generic.go:334] "Generic (PLEG): container finished" podID="ce831ec4-3f79-4a97-856b-00438d195fac" containerID="39b37a78ac9c0cc981fcb4959e819bd8bca5a88b4565b3af90762f8cecdf2e96" exitCode=2 Dec 01 10:48:58 crc kubenswrapper[4909]: I1201 10:48:58.624937 4909 generic.go:334] "Generic (PLEG): container finished" podID="ce831ec4-3f79-4a97-856b-00438d195fac" containerID="fb3ee38361b3d9de607d9ccd314ce580d56be4ce958e3ae912e7acffa6a6b1af" exitCode=0 Dec 01 10:48:58 crc kubenswrapper[4909]: I1201 10:48:58.624491 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ce831ec4-3f79-4a97-856b-00438d195fac","Type":"ContainerDied","Data":"412fafababbcc18263156c61b90ad1dec5e97037eb1c2eba8af0fd071718b077"} Dec 01 10:48:58 crc kubenswrapper[4909]: I1201 10:48:58.624969 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce831ec4-3f79-4a97-856b-00438d195fac","Type":"ContainerDied","Data":"39b37a78ac9c0cc981fcb4959e819bd8bca5a88b4565b3af90762f8cecdf2e96"} Dec 01 10:48:58 crc kubenswrapper[4909]: I1201 10:48:58.624980 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce831ec4-3f79-4a97-856b-00438d195fac","Type":"ContainerDied","Data":"fb3ee38361b3d9de607d9ccd314ce580d56be4ce958e3ae912e7acffa6a6b1af"} Dec 01 10:48:59 crc kubenswrapper[4909]: I1201 10:48:59.279804 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c067d444-fbd3-4601-bc51-29f9d7c25afb" path="/var/lib/kubelet/pods/c067d444-fbd3-4601-bc51-29f9d7c25afb/volumes" Dec 01 10:48:59 crc kubenswrapper[4909]: I1201 10:48:59.636953 4909 generic.go:334] "Generic (PLEG): container finished" podID="8cd6cd1c-99b8-4639-828e-9585790b9d26" containerID="8711a9b6a33a1ddee459a43a67208d9e582e2890e46890cd4d3ca3e677d3e22a" exitCode=0 Dec 01 10:48:59 crc kubenswrapper[4909]: I1201 10:48:59.637023 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-975kr" event={"ID":"8cd6cd1c-99b8-4639-828e-9585790b9d26","Type":"ContainerDied","Data":"8711a9b6a33a1ddee459a43a67208d9e582e2890e46890cd4d3ca3e677d3e22a"} Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.510552 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.533479 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-scripts\") pod \"ce831ec4-3f79-4a97-856b-00438d195fac\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.533691 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-combined-ca-bundle\") pod \"ce831ec4-3f79-4a97-856b-00438d195fac\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.533765 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce831ec4-3f79-4a97-856b-00438d195fac-log-httpd\") pod \"ce831ec4-3f79-4a97-856b-00438d195fac\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.533851 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-sg-core-conf-yaml\") pod \"ce831ec4-3f79-4a97-856b-00438d195fac\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.534039 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce831ec4-3f79-4a97-856b-00438d195fac-run-httpd\") pod \"ce831ec4-3f79-4a97-856b-00438d195fac\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.534135 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-config-data\") pod \"ce831ec4-3f79-4a97-856b-00438d195fac\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.534199 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82dgz\" (UniqueName: \"kubernetes.io/projected/ce831ec4-3f79-4a97-856b-00438d195fac-kube-api-access-82dgz\") pod \"ce831ec4-3f79-4a97-856b-00438d195fac\" (UID: \"ce831ec4-3f79-4a97-856b-00438d195fac\") " Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.545238 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce831ec4-3f79-4a97-856b-00438d195fac-kube-api-access-82dgz" (OuterVolumeSpecName: "kube-api-access-82dgz") pod "ce831ec4-3f79-4a97-856b-00438d195fac" (UID: "ce831ec4-3f79-4a97-856b-00438d195fac"). InnerVolumeSpecName "kube-api-access-82dgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.545425 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce831ec4-3f79-4a97-856b-00438d195fac-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ce831ec4-3f79-4a97-856b-00438d195fac" (UID: "ce831ec4-3f79-4a97-856b-00438d195fac"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.548044 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce831ec4-3f79-4a97-856b-00438d195fac-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ce831ec4-3f79-4a97-856b-00438d195fac" (UID: "ce831ec4-3f79-4a97-856b-00438d195fac"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.556040 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-scripts" (OuterVolumeSpecName: "scripts") pod "ce831ec4-3f79-4a97-856b-00438d195fac" (UID: "ce831ec4-3f79-4a97-856b-00438d195fac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.600248 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ce831ec4-3f79-4a97-856b-00438d195fac" (UID: "ce831ec4-3f79-4a97-856b-00438d195fac"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.637073 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82dgz\" (UniqueName: \"kubernetes.io/projected/ce831ec4-3f79-4a97-856b-00438d195fac-kube-api-access-82dgz\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.637107 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.637116 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce831ec4-3f79-4a97-856b-00438d195fac-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.637129 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:00 crc 
kubenswrapper[4909]: I1201 10:49:00.637139 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce831ec4-3f79-4a97-856b-00438d195fac-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.645888 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-config-data" (OuterVolumeSpecName: "config-data") pod "ce831ec4-3f79-4a97-856b-00438d195fac" (UID: "ce831ec4-3f79-4a97-856b-00438d195fac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.647784 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce831ec4-3f79-4a97-856b-00438d195fac" (UID: "ce831ec4-3f79-4a97-856b-00438d195fac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.649050 4909 generic.go:334] "Generic (PLEG): container finished" podID="ce831ec4-3f79-4a97-856b-00438d195fac" containerID="616a3a90342bf6925dda980d23e00f3e02ddc62d7fb6d9cf7ca907b91c8db23d" exitCode=0 Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.649131 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.649150 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce831ec4-3f79-4a97-856b-00438d195fac","Type":"ContainerDied","Data":"616a3a90342bf6925dda980d23e00f3e02ddc62d7fb6d9cf7ca907b91c8db23d"} Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.649214 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce831ec4-3f79-4a97-856b-00438d195fac","Type":"ContainerDied","Data":"c39ee7cbe3fe062ae4d5bd3ed31049098069a5f8dc73ca86336a70bd9962b1ae"} Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.649240 4909 scope.go:117] "RemoveContainer" containerID="412fafababbcc18263156c61b90ad1dec5e97037eb1c2eba8af0fd071718b077" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.681237 4909 scope.go:117] "RemoveContainer" containerID="39b37a78ac9c0cc981fcb4959e819bd8bca5a88b4565b3af90762f8cecdf2e96" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.690719 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.700994 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.716980 4909 scope.go:117] "RemoveContainer" containerID="616a3a90342bf6925dda980d23e00f3e02ddc62d7fb6d9cf7ca907b91c8db23d" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.720409 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:49:00 crc kubenswrapper[4909]: E1201 10:49:00.723574 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c067d444-fbd3-4601-bc51-29f9d7c25afb" containerName="init" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.723607 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c067d444-fbd3-4601-bc51-29f9d7c25afb" containerName="init" 
Dec 01 10:49:00 crc kubenswrapper[4909]: E1201 10:49:00.723625 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce831ec4-3f79-4a97-856b-00438d195fac" containerName="sg-core" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.723640 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce831ec4-3f79-4a97-856b-00438d195fac" containerName="sg-core" Dec 01 10:49:00 crc kubenswrapper[4909]: E1201 10:49:00.723666 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce831ec4-3f79-4a97-856b-00438d195fac" containerName="ceilometer-notification-agent" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.723673 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce831ec4-3f79-4a97-856b-00438d195fac" containerName="ceilometer-notification-agent" Dec 01 10:49:00 crc kubenswrapper[4909]: E1201 10:49:00.723686 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce831ec4-3f79-4a97-856b-00438d195fac" containerName="ceilometer-central-agent" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.723693 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce831ec4-3f79-4a97-856b-00438d195fac" containerName="ceilometer-central-agent" Dec 01 10:49:00 crc kubenswrapper[4909]: E1201 10:49:00.723722 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c067d444-fbd3-4601-bc51-29f9d7c25afb" containerName="dnsmasq-dns" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.723730 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c067d444-fbd3-4601-bc51-29f9d7c25afb" containerName="dnsmasq-dns" Dec 01 10:49:00 crc kubenswrapper[4909]: E1201 10:49:00.723749 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce831ec4-3f79-4a97-856b-00438d195fac" containerName="proxy-httpd" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.723756 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce831ec4-3f79-4a97-856b-00438d195fac" containerName="proxy-httpd" 
Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.724046 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c067d444-fbd3-4601-bc51-29f9d7c25afb" containerName="dnsmasq-dns" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.724099 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce831ec4-3f79-4a97-856b-00438d195fac" containerName="proxy-httpd" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.724117 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce831ec4-3f79-4a97-856b-00438d195fac" containerName="ceilometer-notification-agent" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.724138 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce831ec4-3f79-4a97-856b-00438d195fac" containerName="ceilometer-central-agent" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.724154 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce831ec4-3f79-4a97-856b-00438d195fac" containerName="sg-core" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.727713 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.731170 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.734313 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.737727 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.746555 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.747294 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk9ws\" (UniqueName: \"kubernetes.io/projected/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-kube-api-access-zk9ws\") pod \"ceilometer-0\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.747348 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-log-httpd\") pod \"ceilometer-0\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.747538 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-scripts\") pod \"ceilometer-0\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " 
pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.747604 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.747745 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-config-data\") pod \"ceilometer-0\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.747798 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-run-httpd\") pod \"ceilometer-0\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.747889 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.747904 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce831ec4-3f79-4a97-856b-00438d195fac-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.754968 4909 scope.go:117] "RemoveContainer" containerID="fb3ee38361b3d9de607d9ccd314ce580d56be4ce958e3ae912e7acffa6a6b1af" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.777046 4909 scope.go:117] "RemoveContainer" 
containerID="412fafababbcc18263156c61b90ad1dec5e97037eb1c2eba8af0fd071718b077" Dec 01 10:49:00 crc kubenswrapper[4909]: E1201 10:49:00.777449 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"412fafababbcc18263156c61b90ad1dec5e97037eb1c2eba8af0fd071718b077\": container with ID starting with 412fafababbcc18263156c61b90ad1dec5e97037eb1c2eba8af0fd071718b077 not found: ID does not exist" containerID="412fafababbcc18263156c61b90ad1dec5e97037eb1c2eba8af0fd071718b077" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.777481 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"412fafababbcc18263156c61b90ad1dec5e97037eb1c2eba8af0fd071718b077"} err="failed to get container status \"412fafababbcc18263156c61b90ad1dec5e97037eb1c2eba8af0fd071718b077\": rpc error: code = NotFound desc = could not find container \"412fafababbcc18263156c61b90ad1dec5e97037eb1c2eba8af0fd071718b077\": container with ID starting with 412fafababbcc18263156c61b90ad1dec5e97037eb1c2eba8af0fd071718b077 not found: ID does not exist" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.777504 4909 scope.go:117] "RemoveContainer" containerID="39b37a78ac9c0cc981fcb4959e819bd8bca5a88b4565b3af90762f8cecdf2e96" Dec 01 10:49:00 crc kubenswrapper[4909]: E1201 10:49:00.777699 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39b37a78ac9c0cc981fcb4959e819bd8bca5a88b4565b3af90762f8cecdf2e96\": container with ID starting with 39b37a78ac9c0cc981fcb4959e819bd8bca5a88b4565b3af90762f8cecdf2e96 not found: ID does not exist" containerID="39b37a78ac9c0cc981fcb4959e819bd8bca5a88b4565b3af90762f8cecdf2e96" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.777732 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"39b37a78ac9c0cc981fcb4959e819bd8bca5a88b4565b3af90762f8cecdf2e96"} err="failed to get container status \"39b37a78ac9c0cc981fcb4959e819bd8bca5a88b4565b3af90762f8cecdf2e96\": rpc error: code = NotFound desc = could not find container \"39b37a78ac9c0cc981fcb4959e819bd8bca5a88b4565b3af90762f8cecdf2e96\": container with ID starting with 39b37a78ac9c0cc981fcb4959e819bd8bca5a88b4565b3af90762f8cecdf2e96 not found: ID does not exist" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.777745 4909 scope.go:117] "RemoveContainer" containerID="616a3a90342bf6925dda980d23e00f3e02ddc62d7fb6d9cf7ca907b91c8db23d" Dec 01 10:49:00 crc kubenswrapper[4909]: E1201 10:49:00.778058 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"616a3a90342bf6925dda980d23e00f3e02ddc62d7fb6d9cf7ca907b91c8db23d\": container with ID starting with 616a3a90342bf6925dda980d23e00f3e02ddc62d7fb6d9cf7ca907b91c8db23d not found: ID does not exist" containerID="616a3a90342bf6925dda980d23e00f3e02ddc62d7fb6d9cf7ca907b91c8db23d" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.778134 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"616a3a90342bf6925dda980d23e00f3e02ddc62d7fb6d9cf7ca907b91c8db23d"} err="failed to get container status \"616a3a90342bf6925dda980d23e00f3e02ddc62d7fb6d9cf7ca907b91c8db23d\": rpc error: code = NotFound desc = could not find container \"616a3a90342bf6925dda980d23e00f3e02ddc62d7fb6d9cf7ca907b91c8db23d\": container with ID starting with 616a3a90342bf6925dda980d23e00f3e02ddc62d7fb6d9cf7ca907b91c8db23d not found: ID does not exist" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.778163 4909 scope.go:117] "RemoveContainer" containerID="fb3ee38361b3d9de607d9ccd314ce580d56be4ce958e3ae912e7acffa6a6b1af" Dec 01 10:49:00 crc kubenswrapper[4909]: E1201 10:49:00.778423 4909 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"fb3ee38361b3d9de607d9ccd314ce580d56be4ce958e3ae912e7acffa6a6b1af\": container with ID starting with fb3ee38361b3d9de607d9ccd314ce580d56be4ce958e3ae912e7acffa6a6b1af not found: ID does not exist" containerID="fb3ee38361b3d9de607d9ccd314ce580d56be4ce958e3ae912e7acffa6a6b1af" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.778472 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb3ee38361b3d9de607d9ccd314ce580d56be4ce958e3ae912e7acffa6a6b1af"} err="failed to get container status \"fb3ee38361b3d9de607d9ccd314ce580d56be4ce958e3ae912e7acffa6a6b1af\": rpc error: code = NotFound desc = could not find container \"fb3ee38361b3d9de607d9ccd314ce580d56be4ce958e3ae912e7acffa6a6b1af\": container with ID starting with fb3ee38361b3d9de607d9ccd314ce580d56be4ce958e3ae912e7acffa6a6b1af not found: ID does not exist" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.849644 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.849724 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-config-data\") pod \"ceilometer-0\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.849756 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-run-httpd\") pod \"ceilometer-0\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " pod="openstack/ceilometer-0" Dec 
01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.849785 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.849834 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk9ws\" (UniqueName: \"kubernetes.io/projected/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-kube-api-access-zk9ws\") pod \"ceilometer-0\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.849857 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-log-httpd\") pod \"ceilometer-0\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.849928 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-scripts\") pod \"ceilometer-0\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.852556 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-log-httpd\") pod \"ceilometer-0\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.854947 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-run-httpd\") pod 
\"ceilometer-0\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.857430 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.857461 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-scripts\") pod \"ceilometer-0\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.857751 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-config-data\") pod \"ceilometer-0\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.861305 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " pod="openstack/ceilometer-0" Dec 01 10:49:00 crc kubenswrapper[4909]: I1201 10:49:00.881398 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk9ws\" (UniqueName: \"kubernetes.io/projected/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-kube-api-access-zk9ws\") pod \"ceilometer-0\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " pod="openstack/ceilometer-0" Dec 01 10:49:01 crc kubenswrapper[4909]: I1201 10:49:01.000458 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-975kr" Dec 01 10:49:01 crc kubenswrapper[4909]: I1201 10:49:01.046457 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:49:01 crc kubenswrapper[4909]: I1201 10:49:01.053750 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs6nd\" (UniqueName: \"kubernetes.io/projected/8cd6cd1c-99b8-4639-828e-9585790b9d26-kube-api-access-bs6nd\") pod \"8cd6cd1c-99b8-4639-828e-9585790b9d26\" (UID: \"8cd6cd1c-99b8-4639-828e-9585790b9d26\") " Dec 01 10:49:01 crc kubenswrapper[4909]: I1201 10:49:01.053864 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8cd6cd1c-99b8-4639-828e-9585790b9d26-db-sync-config-data\") pod \"8cd6cd1c-99b8-4639-828e-9585790b9d26\" (UID: \"8cd6cd1c-99b8-4639-828e-9585790b9d26\") " Dec 01 10:49:01 crc kubenswrapper[4909]: I1201 10:49:01.053943 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd6cd1c-99b8-4639-828e-9585790b9d26-combined-ca-bundle\") pod \"8cd6cd1c-99b8-4639-828e-9585790b9d26\" (UID: \"8cd6cd1c-99b8-4639-828e-9585790b9d26\") " Dec 01 10:49:01 crc kubenswrapper[4909]: I1201 10:49:01.057323 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cd6cd1c-99b8-4639-828e-9585790b9d26-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8cd6cd1c-99b8-4639-828e-9585790b9d26" (UID: "8cd6cd1c-99b8-4639-828e-9585790b9d26"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:01 crc kubenswrapper[4909]: I1201 10:49:01.059284 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cd6cd1c-99b8-4639-828e-9585790b9d26-kube-api-access-bs6nd" (OuterVolumeSpecName: "kube-api-access-bs6nd") pod "8cd6cd1c-99b8-4639-828e-9585790b9d26" (UID: "8cd6cd1c-99b8-4639-828e-9585790b9d26"). InnerVolumeSpecName "kube-api-access-bs6nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:01 crc kubenswrapper[4909]: I1201 10:49:01.088406 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cd6cd1c-99b8-4639-828e-9585790b9d26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cd6cd1c-99b8-4639-828e-9585790b9d26" (UID: "8cd6cd1c-99b8-4639-828e-9585790b9d26"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:01 crc kubenswrapper[4909]: I1201 10:49:01.160004 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs6nd\" (UniqueName: \"kubernetes.io/projected/8cd6cd1c-99b8-4639-828e-9585790b9d26-kube-api-access-bs6nd\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:01 crc kubenswrapper[4909]: I1201 10:49:01.160776 4909 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8cd6cd1c-99b8-4639-828e-9585790b9d26-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:01 crc kubenswrapper[4909]: I1201 10:49:01.160799 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd6cd1c-99b8-4639-828e-9585790b9d26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:01 crc kubenswrapper[4909]: I1201 10:49:01.267810 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce831ec4-3f79-4a97-856b-00438d195fac" 
path="/var/lib/kubelet/pods/ce831ec4-3f79-4a97-856b-00438d195fac/volumes" Dec 01 10:49:01 crc kubenswrapper[4909]: I1201 10:49:01.497818 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:49:01 crc kubenswrapper[4909]: I1201 10:49:01.658422 4909 generic.go:334] "Generic (PLEG): container finished" podID="00cc5cc5-c22b-4523-96c5-baa507ad0ce1" containerID="0eb9e3c4665fd1d6854ac6b37fbe4826ac4563d9e160e20e11d8b7c643d33471" exitCode=0 Dec 01 10:49:01 crc kubenswrapper[4909]: I1201 10:49:01.658492 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6nt8w" event={"ID":"00cc5cc5-c22b-4523-96c5-baa507ad0ce1","Type":"ContainerDied","Data":"0eb9e3c4665fd1d6854ac6b37fbe4826ac4563d9e160e20e11d8b7c643d33471"} Dec 01 10:49:01 crc kubenswrapper[4909]: I1201 10:49:01.660188 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-975kr" Dec 01 10:49:01 crc kubenswrapper[4909]: I1201 10:49:01.660186 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-975kr" event={"ID":"8cd6cd1c-99b8-4639-828e-9585790b9d26","Type":"ContainerDied","Data":"96d53d6877496df812dcd145ec478ebf26d708a59595c121f932e9eec03dfa23"} Dec 01 10:49:01 crc kubenswrapper[4909]: I1201 10:49:01.660336 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96d53d6877496df812dcd145ec478ebf26d708a59595c121f932e9eec03dfa23" Dec 01 10:49:01 crc kubenswrapper[4909]: I1201 10:49:01.663080 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac","Type":"ContainerStarted","Data":"fed63f7609ab25c94a2d1f059d9919cdf58c1c1ee799fa908257a2af1660bc6d"} Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.306742 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-779888c757-glwbd"] Dec 01 10:49:02 crc kubenswrapper[4909]: 
E1201 10:49:02.307696 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cd6cd1c-99b8-4639-828e-9585790b9d26" containerName="barbican-db-sync" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.307714 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cd6cd1c-99b8-4639-828e-9585790b9d26" containerName="barbican-db-sync" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.307929 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cd6cd1c-99b8-4639-828e-9585790b9d26" containerName="barbican-db-sync" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.309003 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-779888c757-glwbd" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.325825 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.326125 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-dt6l6" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.326421 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.346473 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-779888c757-glwbd"] Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.402941 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e-combined-ca-bundle\") pod \"barbican-worker-779888c757-glwbd\" (UID: \"a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e\") " pod="openstack/barbican-worker-779888c757-glwbd" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.403024 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e-config-data\") pod \"barbican-worker-779888c757-glwbd\" (UID: \"a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e\") " pod="openstack/barbican-worker-779888c757-glwbd" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.403065 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e-logs\") pod \"barbican-worker-779888c757-glwbd\" (UID: \"a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e\") " pod="openstack/barbican-worker-779888c757-glwbd" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.403126 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e-config-data-custom\") pod \"barbican-worker-779888c757-glwbd\" (UID: \"a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e\") " pod="openstack/barbican-worker-779888c757-glwbd" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.403163 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtjbf\" (UniqueName: \"kubernetes.io/projected/a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e-kube-api-access-qtjbf\") pod \"barbican-worker-779888c757-glwbd\" (UID: \"a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e\") " pod="openstack/barbican-worker-779888c757-glwbd" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.409339 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-66c459b97d-lt964"] Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.411468 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-66c459b97d-lt964" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.420590 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.457844 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-66c459b97d-lt964"] Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.496561 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-c6sch"] Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.498588 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-c6sch" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.504927 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-c6sch"] Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.505257 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e-config-data\") pod \"barbican-worker-779888c757-glwbd\" (UID: \"a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e\") " pod="openstack/barbican-worker-779888c757-glwbd" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.505365 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr26q\" (UniqueName: \"kubernetes.io/projected/b0fb7293-36bc-4a84-b9a7-9eec7a62367b-kube-api-access-jr26q\") pod \"barbican-keystone-listener-66c459b97d-lt964\" (UID: \"b0fb7293-36bc-4a84-b9a7-9eec7a62367b\") " pod="openstack/barbican-keystone-listener-66c459b97d-lt964" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.505413 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e-logs\") pod \"barbican-worker-779888c757-glwbd\" (UID: \"a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e\") " pod="openstack/barbican-worker-779888c757-glwbd" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.505466 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0fb7293-36bc-4a84-b9a7-9eec7a62367b-logs\") pod \"barbican-keystone-listener-66c459b97d-lt964\" (UID: \"b0fb7293-36bc-4a84-b9a7-9eec7a62367b\") " pod="openstack/barbican-keystone-listener-66c459b97d-lt964" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.505522 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e-config-data-custom\") pod \"barbican-worker-779888c757-glwbd\" (UID: \"a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e\") " pod="openstack/barbican-worker-779888c757-glwbd" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.505549 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0fb7293-36bc-4a84-b9a7-9eec7a62367b-config-data\") pod \"barbican-keystone-listener-66c459b97d-lt964\" (UID: \"b0fb7293-36bc-4a84-b9a7-9eec7a62367b\") " pod="openstack/barbican-keystone-listener-66c459b97d-lt964" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.505601 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0fb7293-36bc-4a84-b9a7-9eec7a62367b-combined-ca-bundle\") pod \"barbican-keystone-listener-66c459b97d-lt964\" (UID: \"b0fb7293-36bc-4a84-b9a7-9eec7a62367b\") " pod="openstack/barbican-keystone-listener-66c459b97d-lt964" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.505632 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0fb7293-36bc-4a84-b9a7-9eec7a62367b-config-data-custom\") pod \"barbican-keystone-listener-66c459b97d-lt964\" (UID: \"b0fb7293-36bc-4a84-b9a7-9eec7a62367b\") " pod="openstack/barbican-keystone-listener-66c459b97d-lt964" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.505655 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtjbf\" (UniqueName: \"kubernetes.io/projected/a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e-kube-api-access-qtjbf\") pod \"barbican-worker-779888c757-glwbd\" (UID: \"a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e\") " pod="openstack/barbican-worker-779888c757-glwbd" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.505681 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e-combined-ca-bundle\") pod \"barbican-worker-779888c757-glwbd\" (UID: \"a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e\") " pod="openstack/barbican-worker-779888c757-glwbd" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.506430 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e-logs\") pod \"barbican-worker-779888c757-glwbd\" (UID: \"a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e\") " pod="openstack/barbican-worker-779888c757-glwbd" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.526438 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e-config-data-custom\") pod \"barbican-worker-779888c757-glwbd\" (UID: \"a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e\") " pod="openstack/barbican-worker-779888c757-glwbd" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.526494 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e-config-data\") pod \"barbican-worker-779888c757-glwbd\" (UID: \"a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e\") " pod="openstack/barbican-worker-779888c757-glwbd" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.526605 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e-combined-ca-bundle\") pod \"barbican-worker-779888c757-glwbd\" (UID: \"a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e\") " pod="openstack/barbican-worker-779888c757-glwbd" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.537611 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtjbf\" (UniqueName: \"kubernetes.io/projected/a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e-kube-api-access-qtjbf\") pod \"barbican-worker-779888c757-glwbd\" (UID: \"a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e\") " pod="openstack/barbican-worker-779888c757-glwbd" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.607942 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-config\") pod \"dnsmasq-dns-6bb684768f-c6sch\" (UID: \"d00ee9c3-9b08-450f-8610-dd151437b1ec\") " pod="openstack/dnsmasq-dns-6bb684768f-c6sch" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.608105 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr26q\" (UniqueName: \"kubernetes.io/projected/b0fb7293-36bc-4a84-b9a7-9eec7a62367b-kube-api-access-jr26q\") pod \"barbican-keystone-listener-66c459b97d-lt964\" (UID: \"b0fb7293-36bc-4a84-b9a7-9eec7a62367b\") " pod="openstack/barbican-keystone-listener-66c459b97d-lt964" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.608310 
4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0fb7293-36bc-4a84-b9a7-9eec7a62367b-logs\") pod \"barbican-keystone-listener-66c459b97d-lt964\" (UID: \"b0fb7293-36bc-4a84-b9a7-9eec7a62367b\") " pod="openstack/barbican-keystone-listener-66c459b97d-lt964" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.608464 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0fb7293-36bc-4a84-b9a7-9eec7a62367b-config-data\") pod \"barbican-keystone-listener-66c459b97d-lt964\" (UID: \"b0fb7293-36bc-4a84-b9a7-9eec7a62367b\") " pod="openstack/barbican-keystone-listener-66c459b97d-lt964" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.608520 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-c6sch\" (UID: \"d00ee9c3-9b08-450f-8610-dd151437b1ec\") " pod="openstack/dnsmasq-dns-6bb684768f-c6sch" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.608556 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-c6sch\" (UID: \"d00ee9c3-9b08-450f-8610-dd151437b1ec\") " pod="openstack/dnsmasq-dns-6bb684768f-c6sch" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.608607 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0fb7293-36bc-4a84-b9a7-9eec7a62367b-combined-ca-bundle\") pod \"barbican-keystone-listener-66c459b97d-lt964\" (UID: \"b0fb7293-36bc-4a84-b9a7-9eec7a62367b\") " pod="openstack/barbican-keystone-listener-66c459b97d-lt964" Dec 01 
10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.608831 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0fb7293-36bc-4a84-b9a7-9eec7a62367b-config-data-custom\") pod \"barbican-keystone-listener-66c459b97d-lt964\" (UID: \"b0fb7293-36bc-4a84-b9a7-9eec7a62367b\") " pod="openstack/barbican-keystone-listener-66c459b97d-lt964" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.608998 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-dns-svc\") pod \"dnsmasq-dns-6bb684768f-c6sch\" (UID: \"d00ee9c3-9b08-450f-8610-dd151437b1ec\") " pod="openstack/dnsmasq-dns-6bb684768f-c6sch" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.609065 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmkzx\" (UniqueName: \"kubernetes.io/projected/d00ee9c3-9b08-450f-8610-dd151437b1ec-kube-api-access-zmkzx\") pod \"dnsmasq-dns-6bb684768f-c6sch\" (UID: \"d00ee9c3-9b08-450f-8610-dd151437b1ec\") " pod="openstack/dnsmasq-dns-6bb684768f-c6sch" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.609471 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0fb7293-36bc-4a84-b9a7-9eec7a62367b-logs\") pod \"barbican-keystone-listener-66c459b97d-lt964\" (UID: \"b0fb7293-36bc-4a84-b9a7-9eec7a62367b\") " pod="openstack/barbican-keystone-listener-66c459b97d-lt964" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.615385 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0fb7293-36bc-4a84-b9a7-9eec7a62367b-combined-ca-bundle\") pod \"barbican-keystone-listener-66c459b97d-lt964\" (UID: \"b0fb7293-36bc-4a84-b9a7-9eec7a62367b\") " 
pod="openstack/barbican-keystone-listener-66c459b97d-lt964" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.615716 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0fb7293-36bc-4a84-b9a7-9eec7a62367b-config-data\") pod \"barbican-keystone-listener-66c459b97d-lt964\" (UID: \"b0fb7293-36bc-4a84-b9a7-9eec7a62367b\") " pod="openstack/barbican-keystone-listener-66c459b97d-lt964" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.624697 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0fb7293-36bc-4a84-b9a7-9eec7a62367b-config-data-custom\") pod \"barbican-keystone-listener-66c459b97d-lt964\" (UID: \"b0fb7293-36bc-4a84-b9a7-9eec7a62367b\") " pod="openstack/barbican-keystone-listener-66c459b97d-lt964" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.627635 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr26q\" (UniqueName: \"kubernetes.io/projected/b0fb7293-36bc-4a84-b9a7-9eec7a62367b-kube-api-access-jr26q\") pod \"barbican-keystone-listener-66c459b97d-lt964\" (UID: \"b0fb7293-36bc-4a84-b9a7-9eec7a62367b\") " pod="openstack/barbican-keystone-listener-66c459b97d-lt964" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.668277 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7db6f97b98-lxdjg"] Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.670924 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.673944 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.680837 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac","Type":"ContainerStarted","Data":"e40e70efd570d4c70827ada2d6d3936048ae8b9c8b8699790adaef2174acc49b"} Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.695563 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7db6f97b98-lxdjg"] Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.711831 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-c6sch\" (UID: \"d00ee9c3-9b08-450f-8610-dd151437b1ec\") " pod="openstack/dnsmasq-dns-6bb684768f-c6sch" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.711900 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-c6sch\" (UID: \"d00ee9c3-9b08-450f-8610-dd151437b1ec\") " pod="openstack/dnsmasq-dns-6bb684768f-c6sch" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.711975 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-dns-svc\") pod \"dnsmasq-dns-6bb684768f-c6sch\" (UID: \"d00ee9c3-9b08-450f-8610-dd151437b1ec\") " pod="openstack/dnsmasq-dns-6bb684768f-c6sch" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.712016 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-zmkzx\" (UniqueName: \"kubernetes.io/projected/d00ee9c3-9b08-450f-8610-dd151437b1ec-kube-api-access-zmkzx\") pod \"dnsmasq-dns-6bb684768f-c6sch\" (UID: \"d00ee9c3-9b08-450f-8610-dd151437b1ec\") " pod="openstack/dnsmasq-dns-6bb684768f-c6sch" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.712062 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52e25265-f246-4650-8657-a22c7ee2cb12-config-data-custom\") pod \"barbican-api-7db6f97b98-lxdjg\" (UID: \"52e25265-f246-4650-8657-a22c7ee2cb12\") " pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.712089 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4cf4\" (UniqueName: \"kubernetes.io/projected/52e25265-f246-4650-8657-a22c7ee2cb12-kube-api-access-s4cf4\") pod \"barbican-api-7db6f97b98-lxdjg\" (UID: \"52e25265-f246-4650-8657-a22c7ee2cb12\") " pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.712113 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e25265-f246-4650-8657-a22c7ee2cb12-combined-ca-bundle\") pod \"barbican-api-7db6f97b98-lxdjg\" (UID: \"52e25265-f246-4650-8657-a22c7ee2cb12\") " pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.712143 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-config\") pod \"dnsmasq-dns-6bb684768f-c6sch\" (UID: \"d00ee9c3-9b08-450f-8610-dd151437b1ec\") " pod="openstack/dnsmasq-dns-6bb684768f-c6sch" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.712181 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52e25265-f246-4650-8657-a22c7ee2cb12-logs\") pod \"barbican-api-7db6f97b98-lxdjg\" (UID: \"52e25265-f246-4650-8657-a22c7ee2cb12\") " pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.712234 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e25265-f246-4650-8657-a22c7ee2cb12-config-data\") pod \"barbican-api-7db6f97b98-lxdjg\" (UID: \"52e25265-f246-4650-8657-a22c7ee2cb12\") " pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.713550 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-c6sch\" (UID: \"d00ee9c3-9b08-450f-8610-dd151437b1ec\") " pod="openstack/dnsmasq-dns-6bb684768f-c6sch" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.713567 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-config\") pod \"dnsmasq-dns-6bb684768f-c6sch\" (UID: \"d00ee9c3-9b08-450f-8610-dd151437b1ec\") " pod="openstack/dnsmasq-dns-6bb684768f-c6sch" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.714239 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-dns-svc\") pod \"dnsmasq-dns-6bb684768f-c6sch\" (UID: \"d00ee9c3-9b08-450f-8610-dd151437b1ec\") " pod="openstack/dnsmasq-dns-6bb684768f-c6sch" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.714991 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-c6sch\" (UID: \"d00ee9c3-9b08-450f-8610-dd151437b1ec\") " pod="openstack/dnsmasq-dns-6bb684768f-c6sch" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.736116 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmkzx\" (UniqueName: \"kubernetes.io/projected/d00ee9c3-9b08-450f-8610-dd151437b1ec-kube-api-access-zmkzx\") pod \"dnsmasq-dns-6bb684768f-c6sch\" (UID: \"d00ee9c3-9b08-450f-8610-dd151437b1ec\") " pod="openstack/dnsmasq-dns-6bb684768f-c6sch" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.750750 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-779888c757-glwbd" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.786434 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-66c459b97d-lt964" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.813947 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e25265-f246-4650-8657-a22c7ee2cb12-config-data\") pod \"barbican-api-7db6f97b98-lxdjg\" (UID: \"52e25265-f246-4650-8657-a22c7ee2cb12\") " pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.814143 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52e25265-f246-4650-8657-a22c7ee2cb12-config-data-custom\") pod \"barbican-api-7db6f97b98-lxdjg\" (UID: \"52e25265-f246-4650-8657-a22c7ee2cb12\") " pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.814166 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4cf4\" (UniqueName: 
\"kubernetes.io/projected/52e25265-f246-4650-8657-a22c7ee2cb12-kube-api-access-s4cf4\") pod \"barbican-api-7db6f97b98-lxdjg\" (UID: \"52e25265-f246-4650-8657-a22c7ee2cb12\") " pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.814182 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e25265-f246-4650-8657-a22c7ee2cb12-combined-ca-bundle\") pod \"barbican-api-7db6f97b98-lxdjg\" (UID: \"52e25265-f246-4650-8657-a22c7ee2cb12\") " pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.814228 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52e25265-f246-4650-8657-a22c7ee2cb12-logs\") pod \"barbican-api-7db6f97b98-lxdjg\" (UID: \"52e25265-f246-4650-8657-a22c7ee2cb12\") " pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.816247 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52e25265-f246-4650-8657-a22c7ee2cb12-logs\") pod \"barbican-api-7db6f97b98-lxdjg\" (UID: \"52e25265-f246-4650-8657-a22c7ee2cb12\") " pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.824541 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e25265-f246-4650-8657-a22c7ee2cb12-combined-ca-bundle\") pod \"barbican-api-7db6f97b98-lxdjg\" (UID: \"52e25265-f246-4650-8657-a22c7ee2cb12\") " pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.824762 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e25265-f246-4650-8657-a22c7ee2cb12-config-data\") pod 
\"barbican-api-7db6f97b98-lxdjg\" (UID: \"52e25265-f246-4650-8657-a22c7ee2cb12\") " pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.826945 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52e25265-f246-4650-8657-a22c7ee2cb12-config-data-custom\") pod \"barbican-api-7db6f97b98-lxdjg\" (UID: \"52e25265-f246-4650-8657-a22c7ee2cb12\") " pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.834409 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4cf4\" (UniqueName: \"kubernetes.io/projected/52e25265-f246-4650-8657-a22c7ee2cb12-kube-api-access-s4cf4\") pod \"barbican-api-7db6f97b98-lxdjg\" (UID: \"52e25265-f246-4650-8657-a22c7ee2cb12\") " pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:02 crc kubenswrapper[4909]: I1201 10:49:02.929244 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-c6sch" Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.000969 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.049109 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.118843 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-etc-machine-id\") pod \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.118930 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "00cc5cc5-c22b-4523-96c5-baa507ad0ce1" (UID: "00cc5cc5-c22b-4523-96c5-baa507ad0ce1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.118970 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-config-data\") pod \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.119091 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-db-sync-config-data\") pod \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.119177 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh9sz\" (UniqueName: \"kubernetes.io/projected/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-kube-api-access-wh9sz\") pod \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.119198 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-combined-ca-bundle\") pod \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.119266 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-scripts\") pod \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\" (UID: \"00cc5cc5-c22b-4523-96c5-baa507ad0ce1\") " Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.119633 4909 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.125169 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-scripts" (OuterVolumeSpecName: "scripts") pod "00cc5cc5-c22b-4523-96c5-baa507ad0ce1" (UID: "00cc5cc5-c22b-4523-96c5-baa507ad0ce1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.125174 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-kube-api-access-wh9sz" (OuterVolumeSpecName: "kube-api-access-wh9sz") pod "00cc5cc5-c22b-4523-96c5-baa507ad0ce1" (UID: "00cc5cc5-c22b-4523-96c5-baa507ad0ce1"). InnerVolumeSpecName "kube-api-access-wh9sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.126584 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "00cc5cc5-c22b-4523-96c5-baa507ad0ce1" (UID: "00cc5cc5-c22b-4523-96c5-baa507ad0ce1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.162732 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00cc5cc5-c22b-4523-96c5-baa507ad0ce1" (UID: "00cc5cc5-c22b-4523-96c5-baa507ad0ce1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.207999 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-config-data" (OuterVolumeSpecName: "config-data") pod "00cc5cc5-c22b-4523-96c5-baa507ad0ce1" (UID: "00cc5cc5-c22b-4523-96c5-baa507ad0ce1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.221347 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.221386 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.221396 4909 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.221407 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh9sz\" (UniqueName: \"kubernetes.io/projected/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-kube-api-access-wh9sz\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.221422 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00cc5cc5-c22b-4523-96c5-baa507ad0ce1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.293012 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-66c459b97d-lt964"] Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.426441 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-779888c757-glwbd"] Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.509638 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-c6sch"] Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.667100 4909 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/barbican-api-7db6f97b98-lxdjg"] Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.706547 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac","Type":"ContainerStarted","Data":"32a8aa49f1ad593334f062df2f7f8aad1cacaa47733a54774208e364c57632c2"} Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.709013 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6nt8w" Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.715324 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6nt8w" event={"ID":"00cc5cc5-c22b-4523-96c5-baa507ad0ce1","Type":"ContainerDied","Data":"d2188bca7aad3341e2f12fa446084138f4da332dbc390011c74225c568d86a68"} Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.715426 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2188bca7aad3341e2f12fa446084138f4da332dbc390011c74225c568d86a68" Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.715450 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-c6sch" event={"ID":"d00ee9c3-9b08-450f-8610-dd151437b1ec","Type":"ContainerStarted","Data":"304c22f6a77335cf7af65a65cc536fad80cf63babf673279c26c6c1203f776b7"} Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.715470 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-779888c757-glwbd" event={"ID":"a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e","Type":"ContainerStarted","Data":"be43108cc048b99e97ed2644c6017c803874991f215a11489fdffb3f87c72a49"} Dec 01 10:49:03 crc kubenswrapper[4909]: I1201 10:49:03.715859 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-66c459b97d-lt964" 
event={"ID":"b0fb7293-36bc-4a84-b9a7-9eec7a62367b","Type":"ContainerStarted","Data":"238d1e62cd4220ae1a69c29a9f3c4b121ae5de5d343cab89c958f3c09bec7054"} Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.067293 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 10:49:04 crc kubenswrapper[4909]: E1201 10:49:04.067838 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00cc5cc5-c22b-4523-96c5-baa507ad0ce1" containerName="cinder-db-sync" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.067855 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="00cc5cc5-c22b-4523-96c5-baa507ad0ce1" containerName="cinder-db-sync" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.068139 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="00cc5cc5-c22b-4523-96c5-baa507ad0ce1" containerName="cinder-db-sync" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.069473 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.073591 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.073845 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.076046 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.080656 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mgtq4" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.083361 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.110509 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-c6sch"] Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.142835 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-config-data\") pod \"cinder-scheduler-0\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.142911 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-scripts\") pod \"cinder-scheduler-0\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.143033 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.143091 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.143106 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.143135 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcxn2\" (UniqueName: \"kubernetes.io/projected/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-kube-api-access-kcxn2\") pod \"cinder-scheduler-0\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.160581 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-wsc9p"] Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.162227 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.178147 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-wsc9p"] Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.245740 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.245807 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-config\") pod \"dnsmasq-dns-6d97fcdd8f-wsc9p\" (UID: \"06f87308-e13d-4bc3-89ed-c0ec275c4824\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.245842 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2knfw\" (UniqueName: \"kubernetes.io/projected/06f87308-e13d-4bc3-89ed-c0ec275c4824-kube-api-access-2knfw\") pod \"dnsmasq-dns-6d97fcdd8f-wsc9p\" (UID: \"06f87308-e13d-4bc3-89ed-c0ec275c4824\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.245889 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.245910 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.245943 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcxn2\" (UniqueName: \"kubernetes.io/projected/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-kube-api-access-kcxn2\") pod \"cinder-scheduler-0\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.245977 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-wsc9p\" (UID: \"06f87308-e13d-4bc3-89ed-c0ec275c4824\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.246017 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-config-data\") pod \"cinder-scheduler-0\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.246038 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-scripts\") pod \"cinder-scheduler-0\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.246057 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-wsc9p\" 
(UID: \"06f87308-e13d-4bc3-89ed-c0ec275c4824\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.246107 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-wsc9p\" (UID: \"06f87308-e13d-4bc3-89ed-c0ec275c4824\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.246465 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.250778 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.252110 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-config-data\") pod \"cinder-scheduler-0\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.256081 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-scripts\") pod \"cinder-scheduler-0\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.260593 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.283932 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.285988 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.293714 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcxn2\" (UniqueName: \"kubernetes.io/projected/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-kube-api-access-kcxn2\") pod \"cinder-scheduler-0\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.295308 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.331628 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.347730 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.347913 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.347958 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-wsc9p\" (UID: \"06f87308-e13d-4bc3-89ed-c0ec275c4824\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.348001 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqzfv\" (UniqueName: \"kubernetes.io/projected/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-kube-api-access-fqzfv\") pod \"cinder-api-0\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.348042 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-wsc9p\" (UID: \"06f87308-e13d-4bc3-89ed-c0ec275c4824\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.348069 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-logs\") pod \"cinder-api-0\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.348112 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-wsc9p\" (UID: \"06f87308-e13d-4bc3-89ed-c0ec275c4824\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" Dec 01 10:49:04 crc 
kubenswrapper[4909]: I1201 10:49:04.348164 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-config-data\") pod \"cinder-api-0\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.348210 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-config\") pod \"dnsmasq-dns-6d97fcdd8f-wsc9p\" (UID: \"06f87308-e13d-4bc3-89ed-c0ec275c4824\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.348230 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.348263 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2knfw\" (UniqueName: \"kubernetes.io/projected/06f87308-e13d-4bc3-89ed-c0ec275c4824-kube-api-access-2knfw\") pod \"dnsmasq-dns-6d97fcdd8f-wsc9p\" (UID: \"06f87308-e13d-4bc3-89ed-c0ec275c4824\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.348290 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-scripts\") pod \"cinder-api-0\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.350607 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-wsc9p\" (UID: \"06f87308-e13d-4bc3-89ed-c0ec275c4824\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.351968 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-wsc9p\" (UID: \"06f87308-e13d-4bc3-89ed-c0ec275c4824\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.354176 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-config\") pod \"dnsmasq-dns-6d97fcdd8f-wsc9p\" (UID: \"06f87308-e13d-4bc3-89ed-c0ec275c4824\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.356630 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-wsc9p\" (UID: \"06f87308-e13d-4bc3-89ed-c0ec275c4824\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.376502 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2knfw\" (UniqueName: \"kubernetes.io/projected/06f87308-e13d-4bc3-89ed-c0ec275c4824-kube-api-access-2knfw\") pod \"dnsmasq-dns-6d97fcdd8f-wsc9p\" (UID: \"06f87308-e13d-4bc3-89ed-c0ec275c4824\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.433493 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.449760 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqzfv\" (UniqueName: \"kubernetes.io/projected/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-kube-api-access-fqzfv\") pod \"cinder-api-0\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.449834 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-logs\") pod \"cinder-api-0\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.449921 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-config-data\") pod \"cinder-api-0\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.449970 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.450031 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-scripts\") pod \"cinder-api-0\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.450094 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.450133 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-config-data-custom\") pod \"cinder-api-0\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.450525 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-logs\") pod \"cinder-api-0\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.450912 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.455567 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-config-data-custom\") pod \"cinder-api-0\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.460119 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-config-data\") pod \"cinder-api-0\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.464129 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.469351 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-scripts\") pod \"cinder-api-0\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.469696 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqzfv\" (UniqueName: \"kubernetes.io/projected/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-kube-api-access-fqzfv\") pod \"cinder-api-0\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.634401 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.664099 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.733375 4909 generic.go:334] "Generic (PLEG): container finished" podID="d00ee9c3-9b08-450f-8610-dd151437b1ec" containerID="ffe7af8c7236e2501ef444cc9ad92743165982806d23cea72589b5c6de8837df" exitCode=0 Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.733421 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-c6sch" event={"ID":"d00ee9c3-9b08-450f-8610-dd151437b1ec","Type":"ContainerDied","Data":"ffe7af8c7236e2501ef444cc9ad92743165982806d23cea72589b5c6de8837df"} Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.748071 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7db6f97b98-lxdjg" event={"ID":"52e25265-f246-4650-8657-a22c7ee2cb12","Type":"ContainerStarted","Data":"ca8e76c4a855d47460d3b67878fdce605e2d33e24166d04a726100a7e0bb1bca"} Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.748128 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7db6f97b98-lxdjg" event={"ID":"52e25265-f246-4650-8657-a22c7ee2cb12","Type":"ContainerStarted","Data":"0e7cd88332f019543d0bb51a0edbe7ee4f562f223b3860af7c247923381f5a06"} Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.748140 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7db6f97b98-lxdjg" event={"ID":"52e25265-f246-4650-8657-a22c7ee2cb12","Type":"ContainerStarted","Data":"1a71870a0d2cad3f8611ff995b3ae617be91f4c67a627f4db0a6b796aca40003"} Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.749212 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:04 crc kubenswrapper[4909]: I1201 10:49:04.749243 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:05 crc kubenswrapper[4909]: E1201 10:49:05.180715 4909 
log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 01 10:49:05 crc kubenswrapper[4909]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/d00ee9c3-9b08-450f-8610-dd151437b1ec/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 01 10:49:05 crc kubenswrapper[4909]: > podSandboxID="304c22f6a77335cf7af65a65cc536fad80cf63babf673279c26c6c1203f776b7" Dec 01 10:49:05 crc kubenswrapper[4909]: E1201 10:49:05.181444 4909 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 01 10:49:05 crc kubenswrapper[4909]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n66bh5ddhdch665h558hcfh59chd6h579h5f4h75h67h5bh556hbh66fh574h5d8hf5hcch64ch649h5fdh86h85h5f8h699h64bh5c9h54ch68bh5c5q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zmkzx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6bb684768f-c6sch_openstack(d00ee9c3-9b08-450f-8610-dd151437b1ec): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/d00ee9c3-9b08-450f-8610-dd151437b1ec/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 01 10:49:05 crc kubenswrapper[4909]: > logger="UnhandledError" Dec 01 10:49:05 crc kubenswrapper[4909]: E1201 10:49:05.182934 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/d00ee9c3-9b08-450f-8610-dd151437b1ec/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-6bb684768f-c6sch" podUID="d00ee9c3-9b08-450f-8610-dd151437b1ec" Dec 01 10:49:05 crc kubenswrapper[4909]: I1201 10:49:05.211178 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7db6f97b98-lxdjg" podStartSLOduration=3.211163585 podStartE2EDuration="3.211163585s" podCreationTimestamp="2025-12-01 10:49:02 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:49:04.799604849 +0000 UTC m=+1062.034075757" watchObservedRunningTime="2025-12-01 10:49:05.211163585 +0000 UTC m=+1062.445634483" Dec 01 10:49:05 crc kubenswrapper[4909]: I1201 10:49:05.221590 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 10:49:05 crc kubenswrapper[4909]: I1201 10:49:05.344340 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-wsc9p"] Dec 01 10:49:05 crc kubenswrapper[4909]: W1201 10:49:05.364494 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06f87308_e13d_4bc3_89ed_c0ec275c4824.slice/crio-86ef8eedcbeaf05b9bdb60e6285528f6ce6275052823f8e7d15ed5a9952a9cec WatchSource:0}: Error finding container 86ef8eedcbeaf05b9bdb60e6285528f6ce6275052823f8e7d15ed5a9952a9cec: Status 404 returned error can't find the container with id 86ef8eedcbeaf05b9bdb60e6285528f6ce6275052823f8e7d15ed5a9952a9cec Dec 01 10:49:05 crc kubenswrapper[4909]: I1201 10:49:05.473420 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 10:49:05 crc kubenswrapper[4909]: I1201 10:49:05.759527 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece","Type":"ContainerStarted","Data":"4513a9d6a54da49b6d025371f9772f79a2d311d995351e32547630b19a061d06"} Dec 01 10:49:05 crc kubenswrapper[4909]: I1201 10:49:05.762623 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac","Type":"ContainerStarted","Data":"fe93d81b91ca9d9e62e11270385a1c14f38da2c4b38973c08dabea2b8c463377"} Dec 01 10:49:05 crc kubenswrapper[4909]: I1201 10:49:05.765676 4909 generic.go:334] "Generic (PLEG): container finished" 
podID="06f87308-e13d-4bc3-89ed-c0ec275c4824" containerID="9f1edb7681b02a45797ae6659b76d203e94036dd3a98c266bbf849e792ab68d8" exitCode=0 Dec 01 10:49:05 crc kubenswrapper[4909]: I1201 10:49:05.766043 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" event={"ID":"06f87308-e13d-4bc3-89ed-c0ec275c4824","Type":"ContainerDied","Data":"9f1edb7681b02a45797ae6659b76d203e94036dd3a98c266bbf849e792ab68d8"} Dec 01 10:49:05 crc kubenswrapper[4909]: I1201 10:49:05.766082 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" event={"ID":"06f87308-e13d-4bc3-89ed-c0ec275c4824","Type":"ContainerStarted","Data":"86ef8eedcbeaf05b9bdb60e6285528f6ce6275052823f8e7d15ed5a9952a9cec"} Dec 01 10:49:06 crc kubenswrapper[4909]: I1201 10:49:06.611786 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-c6sch" Dec 01 10:49:06 crc kubenswrapper[4909]: I1201 10:49:06.792392 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"56afb5f9-412e-4590-8ac7-9f86d79dd2e6","Type":"ContainerStarted","Data":"fe15492aa43bd376aa7c3b161217cd706236e86a02e583ec72f54016b95db5ce"} Dec 01 10:49:06 crc kubenswrapper[4909]: I1201 10:49:06.795935 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-c6sch" Dec 01 10:49:06 crc kubenswrapper[4909]: I1201 10:49:06.795979 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-c6sch" event={"ID":"d00ee9c3-9b08-450f-8610-dd151437b1ec","Type":"ContainerDied","Data":"304c22f6a77335cf7af65a65cc536fad80cf63babf673279c26c6c1203f776b7"} Dec 01 10:49:06 crc kubenswrapper[4909]: I1201 10:49:06.796034 4909 scope.go:117] "RemoveContainer" containerID="ffe7af8c7236e2501ef444cc9ad92743165982806d23cea72589b5c6de8837df" Dec 01 10:49:06 crc kubenswrapper[4909]: I1201 10:49:06.813299 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-ovsdbserver-sb\") pod \"d00ee9c3-9b08-450f-8610-dd151437b1ec\" (UID: \"d00ee9c3-9b08-450f-8610-dd151437b1ec\") " Dec 01 10:49:06 crc kubenswrapper[4909]: I1201 10:49:06.813649 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-config\") pod \"d00ee9c3-9b08-450f-8610-dd151437b1ec\" (UID: \"d00ee9c3-9b08-450f-8610-dd151437b1ec\") " Dec 01 10:49:06 crc kubenswrapper[4909]: I1201 10:49:06.813699 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmkzx\" (UniqueName: \"kubernetes.io/projected/d00ee9c3-9b08-450f-8610-dd151437b1ec-kube-api-access-zmkzx\") pod \"d00ee9c3-9b08-450f-8610-dd151437b1ec\" (UID: \"d00ee9c3-9b08-450f-8610-dd151437b1ec\") " Dec 01 10:49:06 crc kubenswrapper[4909]: I1201 10:49:06.813755 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-ovsdbserver-nb\") pod \"d00ee9c3-9b08-450f-8610-dd151437b1ec\" (UID: \"d00ee9c3-9b08-450f-8610-dd151437b1ec\") " Dec 01 10:49:06 crc 
kubenswrapper[4909]: I1201 10:49:06.813776 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-dns-svc\") pod \"d00ee9c3-9b08-450f-8610-dd151437b1ec\" (UID: \"d00ee9c3-9b08-450f-8610-dd151437b1ec\") " Dec 01 10:49:06 crc kubenswrapper[4909]: I1201 10:49:06.828635 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d00ee9c3-9b08-450f-8610-dd151437b1ec-kube-api-access-zmkzx" (OuterVolumeSpecName: "kube-api-access-zmkzx") pod "d00ee9c3-9b08-450f-8610-dd151437b1ec" (UID: "d00ee9c3-9b08-450f-8610-dd151437b1ec"). InnerVolumeSpecName "kube-api-access-zmkzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:06 crc kubenswrapper[4909]: I1201 10:49:06.920841 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmkzx\" (UniqueName: \"kubernetes.io/projected/d00ee9c3-9b08-450f-8610-dd151437b1ec-kube-api-access-zmkzx\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:06 crc kubenswrapper[4909]: I1201 10:49:06.986918 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d00ee9c3-9b08-450f-8610-dd151437b1ec" (UID: "d00ee9c3-9b08-450f-8610-dd151437b1ec"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:49:06 crc kubenswrapper[4909]: I1201 10:49:06.988267 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-config" (OuterVolumeSpecName: "config") pod "d00ee9c3-9b08-450f-8610-dd151437b1ec" (UID: "d00ee9c3-9b08-450f-8610-dd151437b1ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:49:06 crc kubenswrapper[4909]: I1201 10:49:06.990206 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d00ee9c3-9b08-450f-8610-dd151437b1ec" (UID: "d00ee9c3-9b08-450f-8610-dd151437b1ec"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:49:06 crc kubenswrapper[4909]: I1201 10:49:06.991597 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d00ee9c3-9b08-450f-8610-dd151437b1ec" (UID: "d00ee9c3-9b08-450f-8610-dd151437b1ec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:49:07 crc kubenswrapper[4909]: I1201 10:49:07.022603 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:07 crc kubenswrapper[4909]: I1201 10:49:07.022636 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:07 crc kubenswrapper[4909]: I1201 10:49:07.022646 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:07 crc kubenswrapper[4909]: I1201 10:49:07.022660 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d00ee9c3-9b08-450f-8610-dd151437b1ec-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:07 crc 
kubenswrapper[4909]: I1201 10:49:07.168985 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-c6sch"] Dec 01 10:49:07 crc kubenswrapper[4909]: I1201 10:49:07.190541 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-c6sch"] Dec 01 10:49:07 crc kubenswrapper[4909]: I1201 10:49:07.289940 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d00ee9c3-9b08-450f-8610-dd151437b1ec" path="/var/lib/kubelet/pods/d00ee9c3-9b08-450f-8610-dd151437b1ec/volumes" Dec 01 10:49:07 crc kubenswrapper[4909]: I1201 10:49:07.370413 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 01 10:49:07 crc kubenswrapper[4909]: I1201 10:49:07.853464 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-779888c757-glwbd" event={"ID":"a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e","Type":"ContainerStarted","Data":"03fc8811945a276fcafc68e23728612f27526bb7d3608d8356081a8636fd3337"} Dec 01 10:49:07 crc kubenswrapper[4909]: I1201 10:49:07.853900 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-779888c757-glwbd" event={"ID":"a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e","Type":"ContainerStarted","Data":"b5a82bc84414da6508aae46151ecc11cd9b3da84f3046b991270097830448dba"} Dec 01 10:49:07 crc kubenswrapper[4909]: I1201 10:49:07.862909 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece","Type":"ContainerStarted","Data":"e2d43170abe47235e449324b5a686345f0db542a58177f54cacd5e503939dd4f"} Dec 01 10:49:07 crc kubenswrapper[4909]: I1201 10:49:07.867673 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-66c459b97d-lt964" event={"ID":"b0fb7293-36bc-4a84-b9a7-9eec7a62367b","Type":"ContainerStarted","Data":"43cfb8776ca33e9a41582c13441324809ae4d727ed0f70a91dd7d95a27d886a2"} Dec 01 10:49:07 crc 
kubenswrapper[4909]: I1201 10:49:07.867723 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-66c459b97d-lt964" event={"ID":"b0fb7293-36bc-4a84-b9a7-9eec7a62367b","Type":"ContainerStarted","Data":"a662bebfa3aaa06cdbfa7cff33cec55868d9846ba9172ea16c2bd1be6723156a"} Dec 01 10:49:07 crc kubenswrapper[4909]: I1201 10:49:07.874543 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac","Type":"ContainerStarted","Data":"014d158da905d1fc5df65a2ea337e84fd6edfd9581eccce41205224a9c4bd394"} Dec 01 10:49:07 crc kubenswrapper[4909]: I1201 10:49:07.875941 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 10:49:07 crc kubenswrapper[4909]: I1201 10:49:07.878544 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-779888c757-glwbd" podStartSLOduration=2.8496996709999998 podStartE2EDuration="5.878526583s" podCreationTimestamp="2025-12-01 10:49:02 +0000 UTC" firstStartedPulling="2025-12-01 10:49:03.424264198 +0000 UTC m=+1060.658735096" lastFinishedPulling="2025-12-01 10:49:06.45309111 +0000 UTC m=+1063.687562008" observedRunningTime="2025-12-01 10:49:07.876981705 +0000 UTC m=+1065.111452603" watchObservedRunningTime="2025-12-01 10:49:07.878526583 +0000 UTC m=+1065.112997491" Dec 01 10:49:07 crc kubenswrapper[4909]: I1201 10:49:07.889459 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"56afb5f9-412e-4590-8ac7-9f86d79dd2e6","Type":"ContainerStarted","Data":"e7d929b88a2332a891bda3385d79c8a4ebd973860453e5ca1b7ff73aa97a9b6d"} Dec 01 10:49:07 crc kubenswrapper[4909]: I1201 10:49:07.899691 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" 
event={"ID":"06f87308-e13d-4bc3-89ed-c0ec275c4824","Type":"ContainerStarted","Data":"aad1d773964d01a8539bc400576046879713b710f8f394e8c490217d5f3eb00d"} Dec 01 10:49:07 crc kubenswrapper[4909]: I1201 10:49:07.900861 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" Dec 01 10:49:07 crc kubenswrapper[4909]: I1201 10:49:07.911789 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-66c459b97d-lt964" podStartSLOduration=2.7631240630000002 podStartE2EDuration="5.911764599s" podCreationTimestamp="2025-12-01 10:49:02 +0000 UTC" firstStartedPulling="2025-12-01 10:49:03.301993877 +0000 UTC m=+1060.536464775" lastFinishedPulling="2025-12-01 10:49:06.450634413 +0000 UTC m=+1063.685105311" observedRunningTime="2025-12-01 10:49:07.900461047 +0000 UTC m=+1065.134931955" watchObservedRunningTime="2025-12-01 10:49:07.911764599 +0000 UTC m=+1065.146235497" Dec 01 10:49:07 crc kubenswrapper[4909]: I1201 10:49:07.960245 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" podStartSLOduration=3.9602212789999998 podStartE2EDuration="3.960221279s" podCreationTimestamp="2025-12-01 10:49:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:49:07.953765388 +0000 UTC m=+1065.188236316" watchObservedRunningTime="2025-12-01 10:49:07.960221279 +0000 UTC m=+1065.194692177" Dec 01 10:49:07 crc kubenswrapper[4909]: I1201 10:49:07.965300 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.972625694 podStartE2EDuration="7.965279277s" podCreationTimestamp="2025-12-01 10:49:00 +0000 UTC" firstStartedPulling="2025-12-01 10:49:01.505243833 +0000 UTC m=+1058.739714731" lastFinishedPulling="2025-12-01 10:49:06.497897406 +0000 UTC m=+1063.732368314" 
observedRunningTime="2025-12-01 10:49:07.930204883 +0000 UTC m=+1065.164675801" watchObservedRunningTime="2025-12-01 10:49:07.965279277 +0000 UTC m=+1065.199750195" Dec 01 10:49:08 crc kubenswrapper[4909]: I1201 10:49:08.809242 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-744d7cd6db-d2s8r"] Dec 01 10:49:08 crc kubenswrapper[4909]: E1201 10:49:08.810295 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d00ee9c3-9b08-450f-8610-dd151437b1ec" containerName="init" Dec 01 10:49:08 crc kubenswrapper[4909]: I1201 10:49:08.810325 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d00ee9c3-9b08-450f-8610-dd151437b1ec" containerName="init" Dec 01 10:49:08 crc kubenswrapper[4909]: I1201 10:49:08.812149 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d00ee9c3-9b08-450f-8610-dd151437b1ec" containerName="init" Dec 01 10:49:08 crc kubenswrapper[4909]: I1201 10:49:08.820722 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:08 crc kubenswrapper[4909]: I1201 10:49:08.831075 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 01 10:49:08 crc kubenswrapper[4909]: I1201 10:49:08.831084 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 01 10:49:08 crc kubenswrapper[4909]: I1201 10:49:08.834445 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-744d7cd6db-d2s8r"] Dec 01 10:49:08 crc kubenswrapper[4909]: I1201 10:49:08.928573 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece","Type":"ContainerStarted","Data":"f24fd0a8bdfe45b49609318275b97598a0d3bd97b759b526d82020c54020728c"} Dec 01 10:49:08 crc kubenswrapper[4909]: I1201 10:49:08.934452 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"56afb5f9-412e-4590-8ac7-9f86d79dd2e6","Type":"ContainerStarted","Data":"ae29488b26cf736be146aa4a9c510b1d41815613b09aa6334ce501408a06929a"} Dec 01 10:49:08 crc kubenswrapper[4909]: I1201 10:49:08.935316 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="56afb5f9-412e-4590-8ac7-9f86d79dd2e6" containerName="cinder-api-log" containerID="cri-o://e7d929b88a2332a891bda3385d79c8a4ebd973860453e5ca1b7ff73aa97a9b6d" gracePeriod=30 Dec 01 10:49:08 crc kubenswrapper[4909]: I1201 10:49:08.935451 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="56afb5f9-412e-4590-8ac7-9f86d79dd2e6" containerName="cinder-api" containerID="cri-o://ae29488b26cf736be146aa4a9c510b1d41815613b09aa6334ce501408a06929a" gracePeriod=30 Dec 01 10:49:08 crc kubenswrapper[4909]: I1201 10:49:08.963465 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d56ac4f-6def-4711-bc77-98ed18152626-config-data-custom\") pod \"barbican-api-744d7cd6db-d2s8r\" (UID: \"3d56ac4f-6def-4711-bc77-98ed18152626\") " pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:08 crc kubenswrapper[4909]: I1201 10:49:08.963527 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d56ac4f-6def-4711-bc77-98ed18152626-logs\") pod \"barbican-api-744d7cd6db-d2s8r\" (UID: \"3d56ac4f-6def-4711-bc77-98ed18152626\") " pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:08 crc kubenswrapper[4909]: I1201 10:49:08.963611 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d56ac4f-6def-4711-bc77-98ed18152626-public-tls-certs\") pod \"barbican-api-744d7cd6db-d2s8r\" (UID: \"3d56ac4f-6def-4711-bc77-98ed18152626\") " pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:08 crc kubenswrapper[4909]: I1201 10:49:08.963638 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d56ac4f-6def-4711-bc77-98ed18152626-internal-tls-certs\") pod \"barbican-api-744d7cd6db-d2s8r\" (UID: \"3d56ac4f-6def-4711-bc77-98ed18152626\") " pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:08 crc kubenswrapper[4909]: I1201 10:49:08.963670 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d56ac4f-6def-4711-bc77-98ed18152626-combined-ca-bundle\") pod \"barbican-api-744d7cd6db-d2s8r\" (UID: \"3d56ac4f-6def-4711-bc77-98ed18152626\") " pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:08 crc kubenswrapper[4909]: I1201 
10:49:08.963717 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh657\" (UniqueName: \"kubernetes.io/projected/3d56ac4f-6def-4711-bc77-98ed18152626-kube-api-access-bh657\") pod \"barbican-api-744d7cd6db-d2s8r\" (UID: \"3d56ac4f-6def-4711-bc77-98ed18152626\") " pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:08 crc kubenswrapper[4909]: I1201 10:49:08.963760 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d56ac4f-6def-4711-bc77-98ed18152626-config-data\") pod \"barbican-api-744d7cd6db-d2s8r\" (UID: \"3d56ac4f-6def-4711-bc77-98ed18152626\") " pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:08 crc kubenswrapper[4909]: I1201 10:49:08.964035 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.4943315200000002 podStartE2EDuration="4.964014452s" podCreationTimestamp="2025-12-01 10:49:04 +0000 UTC" firstStartedPulling="2025-12-01 10:49:05.266368986 +0000 UTC m=+1062.500839884" lastFinishedPulling="2025-12-01 10:49:06.736051918 +0000 UTC m=+1063.970522816" observedRunningTime="2025-12-01 10:49:08.956160537 +0000 UTC m=+1066.190631445" watchObservedRunningTime="2025-12-01 10:49:08.964014452 +0000 UTC m=+1066.198485360" Dec 01 10:49:08 crc kubenswrapper[4909]: I1201 10:49:08.976054 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.976036747 podStartE2EDuration="4.976036747s" podCreationTimestamp="2025-12-01 10:49:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:49:08.974618442 +0000 UTC m=+1066.209089360" watchObservedRunningTime="2025-12-01 10:49:08.976036747 +0000 UTC m=+1066.210507655" Dec 01 10:49:09 crc kubenswrapper[4909]: 
I1201 10:49:09.065239 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d56ac4f-6def-4711-bc77-98ed18152626-config-data\") pod \"barbican-api-744d7cd6db-d2s8r\" (UID: \"3d56ac4f-6def-4711-bc77-98ed18152626\") " pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:09 crc kubenswrapper[4909]: I1201 10:49:09.065613 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d56ac4f-6def-4711-bc77-98ed18152626-config-data-custom\") pod \"barbican-api-744d7cd6db-d2s8r\" (UID: \"3d56ac4f-6def-4711-bc77-98ed18152626\") " pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:09 crc kubenswrapper[4909]: I1201 10:49:09.066946 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d56ac4f-6def-4711-bc77-98ed18152626-logs\") pod \"barbican-api-744d7cd6db-d2s8r\" (UID: \"3d56ac4f-6def-4711-bc77-98ed18152626\") " pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:09 crc kubenswrapper[4909]: I1201 10:49:09.068424 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d56ac4f-6def-4711-bc77-98ed18152626-logs\") pod \"barbican-api-744d7cd6db-d2s8r\" (UID: \"3d56ac4f-6def-4711-bc77-98ed18152626\") " pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:09 crc kubenswrapper[4909]: I1201 10:49:09.072790 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d56ac4f-6def-4711-bc77-98ed18152626-config-data\") pod \"barbican-api-744d7cd6db-d2s8r\" (UID: \"3d56ac4f-6def-4711-bc77-98ed18152626\") " pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:09 crc kubenswrapper[4909]: I1201 10:49:09.073621 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d56ac4f-6def-4711-bc77-98ed18152626-config-data-custom\") pod \"barbican-api-744d7cd6db-d2s8r\" (UID: \"3d56ac4f-6def-4711-bc77-98ed18152626\") " pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:09 crc kubenswrapper[4909]: I1201 10:49:09.077132 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d56ac4f-6def-4711-bc77-98ed18152626-public-tls-certs\") pod \"barbican-api-744d7cd6db-d2s8r\" (UID: \"3d56ac4f-6def-4711-bc77-98ed18152626\") " pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:09 crc kubenswrapper[4909]: I1201 10:49:09.077212 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d56ac4f-6def-4711-bc77-98ed18152626-internal-tls-certs\") pod \"barbican-api-744d7cd6db-d2s8r\" (UID: \"3d56ac4f-6def-4711-bc77-98ed18152626\") " pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:09 crc kubenswrapper[4909]: I1201 10:49:09.077320 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d56ac4f-6def-4711-bc77-98ed18152626-combined-ca-bundle\") pod \"barbican-api-744d7cd6db-d2s8r\" (UID: \"3d56ac4f-6def-4711-bc77-98ed18152626\") " pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:09 crc kubenswrapper[4909]: I1201 10:49:09.077508 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh657\" (UniqueName: \"kubernetes.io/projected/3d56ac4f-6def-4711-bc77-98ed18152626-kube-api-access-bh657\") pod \"barbican-api-744d7cd6db-d2s8r\" (UID: \"3d56ac4f-6def-4711-bc77-98ed18152626\") " pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:09 crc kubenswrapper[4909]: I1201 10:49:09.080444 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/3d56ac4f-6def-4711-bc77-98ed18152626-public-tls-certs\") pod \"barbican-api-744d7cd6db-d2s8r\" (UID: \"3d56ac4f-6def-4711-bc77-98ed18152626\") " pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:09 crc kubenswrapper[4909]: I1201 10:49:09.085437 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d56ac4f-6def-4711-bc77-98ed18152626-internal-tls-certs\") pod \"barbican-api-744d7cd6db-d2s8r\" (UID: \"3d56ac4f-6def-4711-bc77-98ed18152626\") " pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:09 crc kubenswrapper[4909]: I1201 10:49:09.089273 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d56ac4f-6def-4711-bc77-98ed18152626-combined-ca-bundle\") pod \"barbican-api-744d7cd6db-d2s8r\" (UID: \"3d56ac4f-6def-4711-bc77-98ed18152626\") " pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:09 crc kubenswrapper[4909]: I1201 10:49:09.113510 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh657\" (UniqueName: \"kubernetes.io/projected/3d56ac4f-6def-4711-bc77-98ed18152626-kube-api-access-bh657\") pod \"barbican-api-744d7cd6db-d2s8r\" (UID: \"3d56ac4f-6def-4711-bc77-98ed18152626\") " pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:09 crc kubenswrapper[4909]: I1201 10:49:09.153739 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:09 crc kubenswrapper[4909]: I1201 10:49:09.434994 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 01 10:49:09 crc kubenswrapper[4909]: I1201 10:49:09.664840 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 01 10:49:09 crc kubenswrapper[4909]: I1201 10:49:09.696952 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-744d7cd6db-d2s8r"] Dec 01 10:49:09 crc kubenswrapper[4909]: I1201 10:49:09.958043 4909 generic.go:334] "Generic (PLEG): container finished" podID="56afb5f9-412e-4590-8ac7-9f86d79dd2e6" containerID="e7d929b88a2332a891bda3385d79c8a4ebd973860453e5ca1b7ff73aa97a9b6d" exitCode=143 Dec 01 10:49:09 crc kubenswrapper[4909]: I1201 10:49:09.958148 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"56afb5f9-412e-4590-8ac7-9f86d79dd2e6","Type":"ContainerDied","Data":"e7d929b88a2332a891bda3385d79c8a4ebd973860453e5ca1b7ff73aa97a9b6d"} Dec 01 10:49:09 crc kubenswrapper[4909]: I1201 10:49:09.960712 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-744d7cd6db-d2s8r" event={"ID":"3d56ac4f-6def-4711-bc77-98ed18152626","Type":"ContainerStarted","Data":"c5509c467292ce5be259e150c55979d95be9f1869afc2bcd86eb46e7e21f3cb1"} Dec 01 10:49:10 crc kubenswrapper[4909]: I1201 10:49:10.245272 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:10 crc kubenswrapper[4909]: I1201 10:49:10.974092 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-744d7cd6db-d2s8r" event={"ID":"3d56ac4f-6def-4711-bc77-98ed18152626","Type":"ContainerStarted","Data":"cab9ea9298461db21a3596c8bd66e5a9c00a831904fd3b44dd13947171e5fb20"} Dec 01 10:49:10 crc kubenswrapper[4909]: I1201 
10:49:10.975730 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:10 crc kubenswrapper[4909]: I1201 10:49:10.976047 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:10 crc kubenswrapper[4909]: I1201 10:49:10.976176 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-744d7cd6db-d2s8r" event={"ID":"3d56ac4f-6def-4711-bc77-98ed18152626","Type":"ContainerStarted","Data":"98d09bf7410b43983cbd4ea4710a8a56328c2e1c78eec9c41d1722ac1d1c326a"} Dec 01 10:49:11 crc kubenswrapper[4909]: I1201 10:49:11.000706 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-744d7cd6db-d2s8r" podStartSLOduration=3.000678042 podStartE2EDuration="3.000678042s" podCreationTimestamp="2025-12-01 10:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:49:10.994403708 +0000 UTC m=+1068.228874616" watchObservedRunningTime="2025-12-01 10:49:11.000678042 +0000 UTC m=+1068.235148950" Dec 01 10:49:11 crc kubenswrapper[4909]: I1201 10:49:11.892301 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:12 crc kubenswrapper[4909]: I1201 10:49:12.294942 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6bd476784b-fck6x" Dec 01 10:49:14 crc kubenswrapper[4909]: I1201 10:49:14.636980 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" Dec 01 10:49:14 crc kubenswrapper[4909]: I1201 10:49:14.712540 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-hv278"] Dec 01 10:49:14 crc kubenswrapper[4909]: I1201 10:49:14.712848 4909 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b946d459c-hv278" podUID="53da20b2-7c45-49d6-9964-4b495bfea701" containerName="dnsmasq-dns" containerID="cri-o://71f492ea1b84428a90c559b3f468226616e84252068deda39c98d1c21e8972cb" gracePeriod=10 Dec 01 10:49:14 crc kubenswrapper[4909]: I1201 10:49:14.782136 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 01 10:49:14 crc kubenswrapper[4909]: I1201 10:49:14.887727 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 10:49:14 crc kubenswrapper[4909]: I1201 10:49:14.960046 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-68f697f85-549sp" Dec 01 10:49:15 crc kubenswrapper[4909]: I1201 10:49:15.023958 4909 generic.go:334] "Generic (PLEG): container finished" podID="53da20b2-7c45-49d6-9964-4b495bfea701" containerID="71f492ea1b84428a90c559b3f468226616e84252068deda39c98d1c21e8972cb" exitCode=0 Dec 01 10:49:15 crc kubenswrapper[4909]: I1201 10:49:15.024058 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-hv278" event={"ID":"53da20b2-7c45-49d6-9964-4b495bfea701","Type":"ContainerDied","Data":"71f492ea1b84428a90c559b3f468226616e84252068deda39c98d1c21e8972cb"} Dec 01 10:49:15 crc kubenswrapper[4909]: I1201 10:49:15.024188 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8ccb21c4-ab30-4230-bd5c-d1da9ef2fece" containerName="cinder-scheduler" containerID="cri-o://e2d43170abe47235e449324b5a686345f0db542a58177f54cacd5e503939dd4f" gracePeriod=30 Dec 01 10:49:15 crc kubenswrapper[4909]: I1201 10:49:15.024614 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8ccb21c4-ab30-4230-bd5c-d1da9ef2fece" containerName="probe" 
containerID="cri-o://f24fd0a8bdfe45b49609318275b97598a0d3bd97b759b526d82020c54020728c" gracePeriod=30 Dec 01 10:49:15 crc kubenswrapper[4909]: I1201 10:49:15.036375 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6bd476784b-fck6x"] Dec 01 10:49:15 crc kubenswrapper[4909]: I1201 10:49:15.036631 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bd476784b-fck6x" podUID="449c0d76-7d5d-4817-8f78-074ae4cb1cfa" containerName="neutron-api" containerID="cri-o://219c64bf1be97806f47c2029fd3211b3edd3a7b777d1e2c3e365744010130179" gracePeriod=30 Dec 01 10:49:15 crc kubenswrapper[4909]: I1201 10:49:15.036805 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bd476784b-fck6x" podUID="449c0d76-7d5d-4817-8f78-074ae4cb1cfa" containerName="neutron-httpd" containerID="cri-o://400c05162b9957e073998220e0a3387fb1f55f81e54c5e184da6075d48e2b8d2" gracePeriod=30 Dec 01 10:49:15 crc kubenswrapper[4909]: I1201 10:49:15.115060 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:49:15 crc kubenswrapper[4909]: I1201 10:49:15.513915 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7548d5fdbb-tbnwp" Dec 01 10:49:15 crc kubenswrapper[4909]: I1201 10:49:15.964287 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-hv278" Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.043944 4909 generic.go:334] "Generic (PLEG): container finished" podID="449c0d76-7d5d-4817-8f78-074ae4cb1cfa" containerID="400c05162b9957e073998220e0a3387fb1f55f81e54c5e184da6075d48e2b8d2" exitCode=0 Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.044011 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd476784b-fck6x" event={"ID":"449c0d76-7d5d-4817-8f78-074ae4cb1cfa","Type":"ContainerDied","Data":"400c05162b9957e073998220e0a3387fb1f55f81e54c5e184da6075d48e2b8d2"} Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.046769 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-hv278" Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.047109 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-hv278" event={"ID":"53da20b2-7c45-49d6-9964-4b495bfea701","Type":"ContainerDied","Data":"1af850c4ac5c51eb6123be269d37cd8009f8028804b4729ac76cca18d1591c2a"} Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.047147 4909 scope.go:117] "RemoveContainer" containerID="71f492ea1b84428a90c559b3f468226616e84252068deda39c98d1c21e8972cb" Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.123458 4909 scope.go:117] "RemoveContainer" containerID="654d40d814da071fafd37efd00e69610a2539ba7a14ee90a7615a5f5c55d3eb9" Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.153836 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw4hp\" (UniqueName: \"kubernetes.io/projected/53da20b2-7c45-49d6-9964-4b495bfea701-kube-api-access-gw4hp\") pod \"53da20b2-7c45-49d6-9964-4b495bfea701\" (UID: \"53da20b2-7c45-49d6-9964-4b495bfea701\") " Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.153915 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-dns-svc\") pod \"53da20b2-7c45-49d6-9964-4b495bfea701\" (UID: \"53da20b2-7c45-49d6-9964-4b495bfea701\") " Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.153939 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-ovsdbserver-sb\") pod \"53da20b2-7c45-49d6-9964-4b495bfea701\" (UID: \"53da20b2-7c45-49d6-9964-4b495bfea701\") " Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.154104 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-ovsdbserver-nb\") pod \"53da20b2-7c45-49d6-9964-4b495bfea701\" (UID: \"53da20b2-7c45-49d6-9964-4b495bfea701\") " Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.154223 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-config\") pod \"53da20b2-7c45-49d6-9964-4b495bfea701\" (UID: \"53da20b2-7c45-49d6-9964-4b495bfea701\") " Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.166009 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53da20b2-7c45-49d6-9964-4b495bfea701-kube-api-access-gw4hp" (OuterVolumeSpecName: "kube-api-access-gw4hp") pod "53da20b2-7c45-49d6-9964-4b495bfea701" (UID: "53da20b2-7c45-49d6-9964-4b495bfea701"). InnerVolumeSpecName "kube-api-access-gw4hp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.205930 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "53da20b2-7c45-49d6-9964-4b495bfea701" (UID: "53da20b2-7c45-49d6-9964-4b495bfea701"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.207784 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "53da20b2-7c45-49d6-9964-4b495bfea701" (UID: "53da20b2-7c45-49d6-9964-4b495bfea701"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.221859 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.222438 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "53da20b2-7c45-49d6-9964-4b495bfea701" (UID: "53da20b2-7c45-49d6-9964-4b495bfea701"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.223032 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-config" (OuterVolumeSpecName: "config") pod "53da20b2-7c45-49d6-9964-4b495bfea701" (UID: "53da20b2-7c45-49d6-9964-4b495bfea701"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.257065 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.257142 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.257408 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw4hp\" (UniqueName: \"kubernetes.io/projected/53da20b2-7c45-49d6-9964-4b495bfea701-kube-api-access-gw4hp\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.257583 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.257696 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53da20b2-7c45-49d6-9964-4b495bfea701-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.381961 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-hv278"] Dec 01 10:49:16 crc kubenswrapper[4909]: I1201 10:49:16.389911 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-hv278"] Dec 01 10:49:17 crc kubenswrapper[4909]: I1201 10:49:17.096936 4909 generic.go:334] "Generic (PLEG): container finished" podID="8ccb21c4-ab30-4230-bd5c-d1da9ef2fece" containerID="f24fd0a8bdfe45b49609318275b97598a0d3bd97b759b526d82020c54020728c" exitCode=0 Dec 01 10:49:17 crc 
kubenswrapper[4909]: I1201 10:49:17.096987 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece","Type":"ContainerDied","Data":"f24fd0a8bdfe45b49609318275b97598a0d3bd97b759b526d82020c54020728c"} Dec 01 10:49:17 crc kubenswrapper[4909]: I1201 10:49:17.268105 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53da20b2-7c45-49d6-9964-4b495bfea701" path="/var/lib/kubelet/pods/53da20b2-7c45-49d6-9964-4b495bfea701/volumes" Dec 01 10:49:17 crc kubenswrapper[4909]: I1201 10:49:17.341586 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 01 10:49:17 crc kubenswrapper[4909]: I1201 10:49:17.968037 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-744d7cd6db-d2s8r" Dec 01 10:49:18 crc kubenswrapper[4909]: I1201 10:49:18.023381 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7db6f97b98-lxdjg"] Dec 01 10:49:18 crc kubenswrapper[4909]: I1201 10:49:18.023641 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7db6f97b98-lxdjg" podUID="52e25265-f246-4650-8657-a22c7ee2cb12" containerName="barbican-api-log" containerID="cri-o://0e7cd88332f019543d0bb51a0edbe7ee4f562f223b3860af7c247923381f5a06" gracePeriod=30 Dec 01 10:49:18 crc kubenswrapper[4909]: I1201 10:49:18.024594 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7db6f97b98-lxdjg" podUID="52e25265-f246-4650-8657-a22c7ee2cb12" containerName="barbican-api" containerID="cri-o://ca8e76c4a855d47460d3b67878fdce605e2d33e24166d04a726100a7e0bb1bca" gracePeriod=30 Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.120291 4909 generic.go:334] "Generic (PLEG): container finished" podID="449c0d76-7d5d-4817-8f78-074ae4cb1cfa" 
containerID="219c64bf1be97806f47c2029fd3211b3edd3a7b777d1e2c3e365744010130179" exitCode=0 Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.120567 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd476784b-fck6x" event={"ID":"449c0d76-7d5d-4817-8f78-074ae4cb1cfa","Type":"ContainerDied","Data":"219c64bf1be97806f47c2029fd3211b3edd3a7b777d1e2c3e365744010130179"} Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.122813 4909 generic.go:334] "Generic (PLEG): container finished" podID="52e25265-f246-4650-8657-a22c7ee2cb12" containerID="0e7cd88332f019543d0bb51a0edbe7ee4f562f223b3860af7c247923381f5a06" exitCode=143 Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.122838 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7db6f97b98-lxdjg" event={"ID":"52e25265-f246-4650-8657-a22c7ee2cb12","Type":"ContainerDied","Data":"0e7cd88332f019543d0bb51a0edbe7ee4f562f223b3860af7c247923381f5a06"} Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.609297 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.718817 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6bd476784b-fck6x" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.729933 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-config-data-custom\") pod \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.730041 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-config-data\") pod \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.730088 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcxn2\" (UniqueName: \"kubernetes.io/projected/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-kube-api-access-kcxn2\") pod \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.730147 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-etc-machine-id\") pod \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.730179 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-scripts\") pod \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.730208 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-combined-ca-bundle\") pod \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\" (UID: \"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece\") " Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.731540 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8ccb21c4-ab30-4230-bd5c-d1da9ef2fece" (UID: "8ccb21c4-ab30-4230-bd5c-d1da9ef2fece"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.739228 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8ccb21c4-ab30-4230-bd5c-d1da9ef2fece" (UID: "8ccb21c4-ab30-4230-bd5c-d1da9ef2fece"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.739530 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-scripts" (OuterVolumeSpecName: "scripts") pod "8ccb21c4-ab30-4230-bd5c-d1da9ef2fece" (UID: "8ccb21c4-ab30-4230-bd5c-d1da9ef2fece"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.745269 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-kube-api-access-kcxn2" (OuterVolumeSpecName: "kube-api-access-kcxn2") pod "8ccb21c4-ab30-4230-bd5c-d1da9ef2fece" (UID: "8ccb21c4-ab30-4230-bd5c-d1da9ef2fece"). InnerVolumeSpecName "kube-api-access-kcxn2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.825970 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ccb21c4-ab30-4230-bd5c-d1da9ef2fece" (UID: "8ccb21c4-ab30-4230-bd5c-d1da9ef2fece"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.836682 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjth7\" (UniqueName: \"kubernetes.io/projected/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-kube-api-access-bjth7\") pod \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\" (UID: \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\") " Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.836770 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-httpd-config\") pod \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\" (UID: \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\") " Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.836819 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-config\") pod \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\" (UID: \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\") " Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.837073 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-combined-ca-bundle\") pod \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\" (UID: \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\") " Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.837181 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-ovndb-tls-certs\") pod \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\" (UID: \"449c0d76-7d5d-4817-8f78-074ae4cb1cfa\") " Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.837733 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.837768 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcxn2\" (UniqueName: \"kubernetes.io/projected/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-kube-api-access-kcxn2\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.837781 4909 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.837793 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.837803 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.840805 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "449c0d76-7d5d-4817-8f78-074ae4cb1cfa" (UID: "449c0d76-7d5d-4817-8f78-074ae4cb1cfa"). 
InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.844082 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-kube-api-access-bjth7" (OuterVolumeSpecName: "kube-api-access-bjth7") pod "449c0d76-7d5d-4817-8f78-074ae4cb1cfa" (UID: "449c0d76-7d5d-4817-8f78-074ae4cb1cfa"). InnerVolumeSpecName "kube-api-access-bjth7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.884104 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-config-data" (OuterVolumeSpecName: "config-data") pod "8ccb21c4-ab30-4230-bd5c-d1da9ef2fece" (UID: "8ccb21c4-ab30-4230-bd5c-d1da9ef2fece"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.887922 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "449c0d76-7d5d-4817-8f78-074ae4cb1cfa" (UID: "449c0d76-7d5d-4817-8f78-074ae4cb1cfa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.903781 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-config" (OuterVolumeSpecName: "config") pod "449c0d76-7d5d-4817-8f78-074ae4cb1cfa" (UID: "449c0d76-7d5d-4817-8f78-074ae4cb1cfa"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.919030 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "449c0d76-7d5d-4817-8f78-074ae4cb1cfa" (UID: "449c0d76-7d5d-4817-8f78-074ae4cb1cfa"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.938699 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.938733 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.938743 4909 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.938752 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjth7\" (UniqueName: \"kubernetes.io/projected/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-kube-api-access-bjth7\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.938764 4909 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:19 crc kubenswrapper[4909]: I1201 10:49:19.938775 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/449c0d76-7d5d-4817-8f78-074ae4cb1cfa-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.123612 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-58dc5cfbbd-v7pkq" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.136081 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bd476784b-fck6x" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.136075 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bd476784b-fck6x" event={"ID":"449c0d76-7d5d-4817-8f78-074ae4cb1cfa","Type":"ContainerDied","Data":"07f45b2c884753c4116383e26e5f00ac677261bb1df3fb775ebe1e32c84dd98c"} Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.136252 4909 scope.go:117] "RemoveContainer" containerID="400c05162b9957e073998220e0a3387fb1f55f81e54c5e184da6075d48e2b8d2" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.138632 4909 generic.go:334] "Generic (PLEG): container finished" podID="8ccb21c4-ab30-4230-bd5c-d1da9ef2fece" containerID="e2d43170abe47235e449324b5a686345f0db542a58177f54cacd5e503939dd4f" exitCode=0 Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.138894 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece","Type":"ContainerDied","Data":"e2d43170abe47235e449324b5a686345f0db542a58177f54cacd5e503939dd4f"} Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.138999 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8ccb21c4-ab30-4230-bd5c-d1da9ef2fece","Type":"ContainerDied","Data":"4513a9d6a54da49b6d025371f9772f79a2d311d995351e32547630b19a061d06"} Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.139114 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.175794 4909 scope.go:117] "RemoveContainer" containerID="219c64bf1be97806f47c2029fd3211b3edd3a7b777d1e2c3e365744010130179" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.217492 4909 scope.go:117] "RemoveContainer" containerID="f24fd0a8bdfe45b49609318275b97598a0d3bd97b759b526d82020c54020728c" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.217677 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6bd476784b-fck6x"] Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.242020 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6bd476784b-fck6x"] Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.261931 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.273338 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.286774 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 10:49:20 crc kubenswrapper[4909]: E1201 10:49:20.287236 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449c0d76-7d5d-4817-8f78-074ae4cb1cfa" containerName="neutron-httpd" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.287254 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="449c0d76-7d5d-4817-8f78-074ae4cb1cfa" containerName="neutron-httpd" Dec 01 10:49:20 crc kubenswrapper[4909]: E1201 10:49:20.287273 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53da20b2-7c45-49d6-9964-4b495bfea701" containerName="dnsmasq-dns" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.287280 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="53da20b2-7c45-49d6-9964-4b495bfea701" containerName="dnsmasq-dns" Dec 01 10:49:20 crc 
kubenswrapper[4909]: E1201 10:49:20.287302 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53da20b2-7c45-49d6-9964-4b495bfea701" containerName="init" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.287310 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="53da20b2-7c45-49d6-9964-4b495bfea701" containerName="init" Dec 01 10:49:20 crc kubenswrapper[4909]: E1201 10:49:20.287318 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449c0d76-7d5d-4817-8f78-074ae4cb1cfa" containerName="neutron-api" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.287324 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="449c0d76-7d5d-4817-8f78-074ae4cb1cfa" containerName="neutron-api" Dec 01 10:49:20 crc kubenswrapper[4909]: E1201 10:49:20.287335 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ccb21c4-ab30-4230-bd5c-d1da9ef2fece" containerName="cinder-scheduler" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.287342 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ccb21c4-ab30-4230-bd5c-d1da9ef2fece" containerName="cinder-scheduler" Dec 01 10:49:20 crc kubenswrapper[4909]: E1201 10:49:20.287356 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ccb21c4-ab30-4230-bd5c-d1da9ef2fece" containerName="probe" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.287369 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ccb21c4-ab30-4230-bd5c-d1da9ef2fece" containerName="probe" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.287562 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ccb21c4-ab30-4230-bd5c-d1da9ef2fece" containerName="cinder-scheduler" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.287579 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="449c0d76-7d5d-4817-8f78-074ae4cb1cfa" containerName="neutron-httpd" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.287590 4909 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8ccb21c4-ab30-4230-bd5c-d1da9ef2fece" containerName="probe" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.287598 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="449c0d76-7d5d-4817-8f78-074ae4cb1cfa" containerName="neutron-api" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.287612 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="53da20b2-7c45-49d6-9964-4b495bfea701" containerName="dnsmasq-dns" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.288574 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.290341 4909 scope.go:117] "RemoveContainer" containerID="e2d43170abe47235e449324b5a686345f0db542a58177f54cacd5e503939dd4f" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.291275 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.297592 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.325472 4909 scope.go:117] "RemoveContainer" containerID="f24fd0a8bdfe45b49609318275b97598a0d3bd97b759b526d82020c54020728c" Dec 01 10:49:20 crc kubenswrapper[4909]: E1201 10:49:20.333056 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f24fd0a8bdfe45b49609318275b97598a0d3bd97b759b526d82020c54020728c\": container with ID starting with f24fd0a8bdfe45b49609318275b97598a0d3bd97b759b526d82020c54020728c not found: ID does not exist" containerID="f24fd0a8bdfe45b49609318275b97598a0d3bd97b759b526d82020c54020728c" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.333123 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f24fd0a8bdfe45b49609318275b97598a0d3bd97b759b526d82020c54020728c"} err="failed to get container status \"f24fd0a8bdfe45b49609318275b97598a0d3bd97b759b526d82020c54020728c\": rpc error: code = NotFound desc = could not find container \"f24fd0a8bdfe45b49609318275b97598a0d3bd97b759b526d82020c54020728c\": container with ID starting with f24fd0a8bdfe45b49609318275b97598a0d3bd97b759b526d82020c54020728c not found: ID does not exist" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.333161 4909 scope.go:117] "RemoveContainer" containerID="e2d43170abe47235e449324b5a686345f0db542a58177f54cacd5e503939dd4f" Dec 01 10:49:20 crc kubenswrapper[4909]: E1201 10:49:20.336363 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2d43170abe47235e449324b5a686345f0db542a58177f54cacd5e503939dd4f\": container with ID starting with e2d43170abe47235e449324b5a686345f0db542a58177f54cacd5e503939dd4f not found: ID does not exist" containerID="e2d43170abe47235e449324b5a686345f0db542a58177f54cacd5e503939dd4f" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.336413 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2d43170abe47235e449324b5a686345f0db542a58177f54cacd5e503939dd4f"} err="failed to get container status \"e2d43170abe47235e449324b5a686345f0db542a58177f54cacd5e503939dd4f\": rpc error: code = NotFound desc = could not find container \"e2d43170abe47235e449324b5a686345f0db542a58177f54cacd5e503939dd4f\": container with ID starting with e2d43170abe47235e449324b5a686345f0db542a58177f54cacd5e503939dd4f not found: ID does not exist" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.348756 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da6d9185-171d-4197-9265-4252b98166e7-config-data-custom\") pod 
\"cinder-scheduler-0\" (UID: \"da6d9185-171d-4197-9265-4252b98166e7\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.348899 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da6d9185-171d-4197-9265-4252b98166e7-scripts\") pod \"cinder-scheduler-0\" (UID: \"da6d9185-171d-4197-9265-4252b98166e7\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.348975 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da6d9185-171d-4197-9265-4252b98166e7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"da6d9185-171d-4197-9265-4252b98166e7\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.349023 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da6d9185-171d-4197-9265-4252b98166e7-config-data\") pod \"cinder-scheduler-0\" (UID: \"da6d9185-171d-4197-9265-4252b98166e7\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.349107 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tncz\" (UniqueName: \"kubernetes.io/projected/da6d9185-171d-4197-9265-4252b98166e7-kube-api-access-4tncz\") pod \"cinder-scheduler-0\" (UID: \"da6d9185-171d-4197-9265-4252b98166e7\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.349214 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da6d9185-171d-4197-9265-4252b98166e7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"da6d9185-171d-4197-9265-4252b98166e7\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.449788 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da6d9185-171d-4197-9265-4252b98166e7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"da6d9185-171d-4197-9265-4252b98166e7\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.450169 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da6d9185-171d-4197-9265-4252b98166e7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"da6d9185-171d-4197-9265-4252b98166e7\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.450248 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da6d9185-171d-4197-9265-4252b98166e7-scripts\") pod \"cinder-scheduler-0\" (UID: \"da6d9185-171d-4197-9265-4252b98166e7\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.450307 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da6d9185-171d-4197-9265-4252b98166e7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"da6d9185-171d-4197-9265-4252b98166e7\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.450368 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da6d9185-171d-4197-9265-4252b98166e7-config-data\") pod \"cinder-scheduler-0\" (UID: \"da6d9185-171d-4197-9265-4252b98166e7\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.450448 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tncz\" (UniqueName: \"kubernetes.io/projected/da6d9185-171d-4197-9265-4252b98166e7-kube-api-access-4tncz\") pod \"cinder-scheduler-0\" (UID: \"da6d9185-171d-4197-9265-4252b98166e7\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.450532 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da6d9185-171d-4197-9265-4252b98166e7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"da6d9185-171d-4197-9265-4252b98166e7\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.455607 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da6d9185-171d-4197-9265-4252b98166e7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"da6d9185-171d-4197-9265-4252b98166e7\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.456821 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da6d9185-171d-4197-9265-4252b98166e7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"da6d9185-171d-4197-9265-4252b98166e7\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.458521 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da6d9185-171d-4197-9265-4252b98166e7-scripts\") pod \"cinder-scheduler-0\" (UID: \"da6d9185-171d-4197-9265-4252b98166e7\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.458721 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da6d9185-171d-4197-9265-4252b98166e7-config-data\") pod 
\"cinder-scheduler-0\" (UID: \"da6d9185-171d-4197-9265-4252b98166e7\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.467454 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tncz\" (UniqueName: \"kubernetes.io/projected/da6d9185-171d-4197-9265-4252b98166e7-kube-api-access-4tncz\") pod \"cinder-scheduler-0\" (UID: \"da6d9185-171d-4197-9265-4252b98166e7\") " pod="openstack/cinder-scheduler-0" Dec 01 10:49:20 crc kubenswrapper[4909]: I1201 10:49:20.609659 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 10:49:21 crc kubenswrapper[4909]: I1201 10:49:21.076061 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 10:49:21 crc kubenswrapper[4909]: I1201 10:49:21.152824 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da6d9185-171d-4197-9265-4252b98166e7","Type":"ContainerStarted","Data":"be3a5eac3a0ae6d199091a5af53b418ac4b3b078591e7afc41b60894fc90831b"} Dec 01 10:49:21 crc kubenswrapper[4909]: I1201 10:49:21.274469 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="449c0d76-7d5d-4817-8f78-074ae4cb1cfa" path="/var/lib/kubelet/pods/449c0d76-7d5d-4817-8f78-074ae4cb1cfa/volumes" Dec 01 10:49:21 crc kubenswrapper[4909]: I1201 10:49:21.275749 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ccb21c4-ab30-4230-bd5c-d1da9ef2fece" path="/var/lib/kubelet/pods/8ccb21c4-ab30-4230-bd5c-d1da9ef2fece/volumes" Dec 01 10:49:21 crc kubenswrapper[4909]: I1201 10:49:21.696108 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:21 crc kubenswrapper[4909]: I1201 10:49:21.783009 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e25265-f246-4650-8657-a22c7ee2cb12-combined-ca-bundle\") pod \"52e25265-f246-4650-8657-a22c7ee2cb12\" (UID: \"52e25265-f246-4650-8657-a22c7ee2cb12\") " Dec 01 10:49:21 crc kubenswrapper[4909]: I1201 10:49:21.783066 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e25265-f246-4650-8657-a22c7ee2cb12-config-data\") pod \"52e25265-f246-4650-8657-a22c7ee2cb12\" (UID: \"52e25265-f246-4650-8657-a22c7ee2cb12\") " Dec 01 10:49:21 crc kubenswrapper[4909]: I1201 10:49:21.783115 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52e25265-f246-4650-8657-a22c7ee2cb12-config-data-custom\") pod \"52e25265-f246-4650-8657-a22c7ee2cb12\" (UID: \"52e25265-f246-4650-8657-a22c7ee2cb12\") " Dec 01 10:49:21 crc kubenswrapper[4909]: I1201 10:49:21.783187 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52e25265-f246-4650-8657-a22c7ee2cb12-logs\") pod \"52e25265-f246-4650-8657-a22c7ee2cb12\" (UID: \"52e25265-f246-4650-8657-a22c7ee2cb12\") " Dec 01 10:49:21 crc kubenswrapper[4909]: I1201 10:49:21.783216 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4cf4\" (UniqueName: \"kubernetes.io/projected/52e25265-f246-4650-8657-a22c7ee2cb12-kube-api-access-s4cf4\") pod \"52e25265-f246-4650-8657-a22c7ee2cb12\" (UID: \"52e25265-f246-4650-8657-a22c7ee2cb12\") " Dec 01 10:49:21 crc kubenswrapper[4909]: I1201 10:49:21.784680 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/52e25265-f246-4650-8657-a22c7ee2cb12-logs" (OuterVolumeSpecName: "logs") pod "52e25265-f246-4650-8657-a22c7ee2cb12" (UID: "52e25265-f246-4650-8657-a22c7ee2cb12"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:49:21 crc kubenswrapper[4909]: I1201 10:49:21.790564 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52e25265-f246-4650-8657-a22c7ee2cb12-kube-api-access-s4cf4" (OuterVolumeSpecName: "kube-api-access-s4cf4") pod "52e25265-f246-4650-8657-a22c7ee2cb12" (UID: "52e25265-f246-4650-8657-a22c7ee2cb12"). InnerVolumeSpecName "kube-api-access-s4cf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:21 crc kubenswrapper[4909]: I1201 10:49:21.795177 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52e25265-f246-4650-8657-a22c7ee2cb12-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "52e25265-f246-4650-8657-a22c7ee2cb12" (UID: "52e25265-f246-4650-8657-a22c7ee2cb12"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:21 crc kubenswrapper[4909]: I1201 10:49:21.816946 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52e25265-f246-4650-8657-a22c7ee2cb12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52e25265-f246-4650-8657-a22c7ee2cb12" (UID: "52e25265-f246-4650-8657-a22c7ee2cb12"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:21 crc kubenswrapper[4909]: I1201 10:49:21.872889 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52e25265-f246-4650-8657-a22c7ee2cb12-config-data" (OuterVolumeSpecName: "config-data") pod "52e25265-f246-4650-8657-a22c7ee2cb12" (UID: "52e25265-f246-4650-8657-a22c7ee2cb12"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:21 crc kubenswrapper[4909]: I1201 10:49:21.886004 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52e25265-f246-4650-8657-a22c7ee2cb12-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:21 crc kubenswrapper[4909]: I1201 10:49:21.886270 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52e25265-f246-4650-8657-a22c7ee2cb12-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:21 crc kubenswrapper[4909]: I1201 10:49:21.886284 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4cf4\" (UniqueName: \"kubernetes.io/projected/52e25265-f246-4650-8657-a22c7ee2cb12-kube-api-access-s4cf4\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:21 crc kubenswrapper[4909]: I1201 10:49:21.886299 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e25265-f246-4650-8657-a22c7ee2cb12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:21 crc kubenswrapper[4909]: I1201 10:49:21.886307 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e25265-f246-4650-8657-a22c7ee2cb12-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.166702 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da6d9185-171d-4197-9265-4252b98166e7","Type":"ContainerStarted","Data":"1b3c285454cb14f747b497fd0f471046f9e99054ad1b788f84a149ad25644b59"} Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.169247 4909 generic.go:334] "Generic (PLEG): container finished" podID="52e25265-f246-4650-8657-a22c7ee2cb12" containerID="ca8e76c4a855d47460d3b67878fdce605e2d33e24166d04a726100a7e0bb1bca" exitCode=0 Dec 01 10:49:22 
crc kubenswrapper[4909]: I1201 10:49:22.169296 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7db6f97b98-lxdjg" event={"ID":"52e25265-f246-4650-8657-a22c7ee2cb12","Type":"ContainerDied","Data":"ca8e76c4a855d47460d3b67878fdce605e2d33e24166d04a726100a7e0bb1bca"} Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.169308 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7db6f97b98-lxdjg" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.169338 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7db6f97b98-lxdjg" event={"ID":"52e25265-f246-4650-8657-a22c7ee2cb12","Type":"ContainerDied","Data":"1a71870a0d2cad3f8611ff995b3ae617be91f4c67a627f4db0a6b796aca40003"} Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.169359 4909 scope.go:117] "RemoveContainer" containerID="ca8e76c4a855d47460d3b67878fdce605e2d33e24166d04a726100a7e0bb1bca" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.216422 4909 scope.go:117] "RemoveContainer" containerID="0e7cd88332f019543d0bb51a0edbe7ee4f562f223b3860af7c247923381f5a06" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.218389 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7db6f97b98-lxdjg"] Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.228593 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7db6f97b98-lxdjg"] Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.237330 4909 scope.go:117] "RemoveContainer" containerID="ca8e76c4a855d47460d3b67878fdce605e2d33e24166d04a726100a7e0bb1bca" Dec 01 10:49:22 crc kubenswrapper[4909]: E1201 10:49:22.237822 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca8e76c4a855d47460d3b67878fdce605e2d33e24166d04a726100a7e0bb1bca\": container with ID starting with 
ca8e76c4a855d47460d3b67878fdce605e2d33e24166d04a726100a7e0bb1bca not found: ID does not exist" containerID="ca8e76c4a855d47460d3b67878fdce605e2d33e24166d04a726100a7e0bb1bca" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.237863 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca8e76c4a855d47460d3b67878fdce605e2d33e24166d04a726100a7e0bb1bca"} err="failed to get container status \"ca8e76c4a855d47460d3b67878fdce605e2d33e24166d04a726100a7e0bb1bca\": rpc error: code = NotFound desc = could not find container \"ca8e76c4a855d47460d3b67878fdce605e2d33e24166d04a726100a7e0bb1bca\": container with ID starting with ca8e76c4a855d47460d3b67878fdce605e2d33e24166d04a726100a7e0bb1bca not found: ID does not exist" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.237909 4909 scope.go:117] "RemoveContainer" containerID="0e7cd88332f019543d0bb51a0edbe7ee4f562f223b3860af7c247923381f5a06" Dec 01 10:49:22 crc kubenswrapper[4909]: E1201 10:49:22.238288 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e7cd88332f019543d0bb51a0edbe7ee4f562f223b3860af7c247923381f5a06\": container with ID starting with 0e7cd88332f019543d0bb51a0edbe7ee4f562f223b3860af7c247923381f5a06 not found: ID does not exist" containerID="0e7cd88332f019543d0bb51a0edbe7ee4f562f223b3860af7c247923381f5a06" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.238394 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e7cd88332f019543d0bb51a0edbe7ee4f562f223b3860af7c247923381f5a06"} err="failed to get container status \"0e7cd88332f019543d0bb51a0edbe7ee4f562f223b3860af7c247923381f5a06\": rpc error: code = NotFound desc = could not find container \"0e7cd88332f019543d0bb51a0edbe7ee4f562f223b3860af7c247923381f5a06\": container with ID starting with 0e7cd88332f019543d0bb51a0edbe7ee4f562f223b3860af7c247923381f5a06 not found: ID does not 
exist" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.610109 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 01 10:49:22 crc kubenswrapper[4909]: E1201 10:49:22.610575 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52e25265-f246-4650-8657-a22c7ee2cb12" containerName="barbican-api-log" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.610593 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e25265-f246-4650-8657-a22c7ee2cb12" containerName="barbican-api-log" Dec 01 10:49:22 crc kubenswrapper[4909]: E1201 10:49:22.610608 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52e25265-f246-4650-8657-a22c7ee2cb12" containerName="barbican-api" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.610634 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e25265-f246-4650-8657-a22c7ee2cb12" containerName="barbican-api" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.610810 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="52e25265-f246-4650-8657-a22c7ee2cb12" containerName="barbican-api-log" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.610833 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="52e25265-f246-4650-8657-a22c7ee2cb12" containerName="barbican-api" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.611538 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.614203 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.614302 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-wcpqn" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.615236 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.618824 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.805696 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/09ab22ab-a48e-4e11-b498-9f78812079b1-openstack-config-secret\") pod \"openstackclient\" (UID: \"09ab22ab-a48e-4e11-b498-9f78812079b1\") " pod="openstack/openstackclient" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.805768 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwmbt\" (UniqueName: \"kubernetes.io/projected/09ab22ab-a48e-4e11-b498-9f78812079b1-kube-api-access-kwmbt\") pod \"openstackclient\" (UID: \"09ab22ab-a48e-4e11-b498-9f78812079b1\") " pod="openstack/openstackclient" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.806456 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/09ab22ab-a48e-4e11-b498-9f78812079b1-openstack-config\") pod \"openstackclient\" (UID: \"09ab22ab-a48e-4e11-b498-9f78812079b1\") " pod="openstack/openstackclient" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.806568 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ab22ab-a48e-4e11-b498-9f78812079b1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"09ab22ab-a48e-4e11-b498-9f78812079b1\") " pod="openstack/openstackclient" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.908765 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwmbt\" (UniqueName: \"kubernetes.io/projected/09ab22ab-a48e-4e11-b498-9f78812079b1-kube-api-access-kwmbt\") pod \"openstackclient\" (UID: \"09ab22ab-a48e-4e11-b498-9f78812079b1\") " pod="openstack/openstackclient" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.908861 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/09ab22ab-a48e-4e11-b498-9f78812079b1-openstack-config\") pod \"openstackclient\" (UID: \"09ab22ab-a48e-4e11-b498-9f78812079b1\") " pod="openstack/openstackclient" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.908916 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ab22ab-a48e-4e11-b498-9f78812079b1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"09ab22ab-a48e-4e11-b498-9f78812079b1\") " pod="openstack/openstackclient" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.908982 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/09ab22ab-a48e-4e11-b498-9f78812079b1-openstack-config-secret\") pod \"openstackclient\" (UID: \"09ab22ab-a48e-4e11-b498-9f78812079b1\") " pod="openstack/openstackclient" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.910224 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/09ab22ab-a48e-4e11-b498-9f78812079b1-openstack-config\") pod \"openstackclient\" (UID: \"09ab22ab-a48e-4e11-b498-9f78812079b1\") " pod="openstack/openstackclient" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.921622 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/09ab22ab-a48e-4e11-b498-9f78812079b1-openstack-config-secret\") pod \"openstackclient\" (UID: \"09ab22ab-a48e-4e11-b498-9f78812079b1\") " pod="openstack/openstackclient" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.922201 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ab22ab-a48e-4e11-b498-9f78812079b1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"09ab22ab-a48e-4e11-b498-9f78812079b1\") " pod="openstack/openstackclient" Dec 01 10:49:22 crc kubenswrapper[4909]: I1201 10:49:22.938367 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwmbt\" (UniqueName: \"kubernetes.io/projected/09ab22ab-a48e-4e11-b498-9f78812079b1-kube-api-access-kwmbt\") pod \"openstackclient\" (UID: \"09ab22ab-a48e-4e11-b498-9f78812079b1\") " pod="openstack/openstackclient" Dec 01 10:49:23 crc kubenswrapper[4909]: I1201 10:49:23.233351 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-wcpqn" Dec 01 10:49:23 crc kubenswrapper[4909]: I1201 10:49:23.233393 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da6d9185-171d-4197-9265-4252b98166e7","Type":"ContainerStarted","Data":"f1ab2e177c4f6a0d7758792654ba143b72f8fb721861cb0cff689fadde82aeb6"} Dec 01 10:49:23 crc kubenswrapper[4909]: I1201 10:49:23.242696 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 01 10:49:23 crc kubenswrapper[4909]: I1201 10:49:23.310665 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52e25265-f246-4650-8657-a22c7ee2cb12" path="/var/lib/kubelet/pods/52e25265-f246-4650-8657-a22c7ee2cb12/volumes" Dec 01 10:49:23 crc kubenswrapper[4909]: I1201 10:49:23.314524 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.314489165 podStartE2EDuration="3.314489165s" podCreationTimestamp="2025-12-01 10:49:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:49:23.301339906 +0000 UTC m=+1080.535810804" watchObservedRunningTime="2025-12-01 10:49:23.314489165 +0000 UTC m=+1080.548960073" Dec 01 10:49:23 crc kubenswrapper[4909]: I1201 10:49:23.892297 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 01 10:49:24 crc kubenswrapper[4909]: I1201 10:49:24.242580 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"09ab22ab-a48e-4e11-b498-9f78812079b1","Type":"ContainerStarted","Data":"d380d2c73d8e9c82592b76acabf43bf861984bb95a7497789e95fc6b115e3858"} Dec 01 10:49:25 crc kubenswrapper[4909]: I1201 10:49:25.610574 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 01 10:49:30 crc kubenswrapper[4909]: I1201 10:49:30.827713 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 01 10:49:31 crc kubenswrapper[4909]: I1201 10:49:31.056393 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 10:49:33 crc kubenswrapper[4909]: I1201 10:49:33.172674 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:49:33 crc 
kubenswrapper[4909]: I1201 10:49:33.173458 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" containerName="ceilometer-central-agent" containerID="cri-o://e40e70efd570d4c70827ada2d6d3936048ae8b9c8b8699790adaef2174acc49b" gracePeriod=30 Dec 01 10:49:33 crc kubenswrapper[4909]: I1201 10:49:33.173503 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" containerName="sg-core" containerID="cri-o://fe93d81b91ca9d9e62e11270385a1c14f38da2c4b38973c08dabea2b8c463377" gracePeriod=30 Dec 01 10:49:33 crc kubenswrapper[4909]: I1201 10:49:33.173547 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" containerName="ceilometer-notification-agent" containerID="cri-o://32a8aa49f1ad593334f062df2f7f8aad1cacaa47733a54774208e364c57632c2" gracePeriod=30 Dec 01 10:49:33 crc kubenswrapper[4909]: I1201 10:49:33.173485 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" containerName="proxy-httpd" containerID="cri-o://014d158da905d1fc5df65a2ea337e84fd6edfd9581eccce41205224a9c4bd394" gracePeriod=30 Dec 01 10:49:34 crc kubenswrapper[4909]: I1201 10:49:34.329008 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"09ab22ab-a48e-4e11-b498-9f78812079b1","Type":"ContainerStarted","Data":"567b4b1b02056d8c839b828fbfc289a25f537ff1a6ec207d077b6480b44068f7"} Dec 01 10:49:34 crc kubenswrapper[4909]: I1201 10:49:34.334914 4909 generic.go:334] "Generic (PLEG): container finished" podID="fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" containerID="014d158da905d1fc5df65a2ea337e84fd6edfd9581eccce41205224a9c4bd394" exitCode=0 Dec 01 10:49:34 crc kubenswrapper[4909]: I1201 10:49:34.334948 
4909 generic.go:334] "Generic (PLEG): container finished" podID="fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" containerID="fe93d81b91ca9d9e62e11270385a1c14f38da2c4b38973c08dabea2b8c463377" exitCode=2 Dec 01 10:49:34 crc kubenswrapper[4909]: I1201 10:49:34.334958 4909 generic.go:334] "Generic (PLEG): container finished" podID="fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" containerID="e40e70efd570d4c70827ada2d6d3936048ae8b9c8b8699790adaef2174acc49b" exitCode=0 Dec 01 10:49:34 crc kubenswrapper[4909]: I1201 10:49:34.334981 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac","Type":"ContainerDied","Data":"014d158da905d1fc5df65a2ea337e84fd6edfd9581eccce41205224a9c4bd394"} Dec 01 10:49:34 crc kubenswrapper[4909]: I1201 10:49:34.335009 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac","Type":"ContainerDied","Data":"fe93d81b91ca9d9e62e11270385a1c14f38da2c4b38973c08dabea2b8c463377"} Dec 01 10:49:34 crc kubenswrapper[4909]: I1201 10:49:34.335022 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac","Type":"ContainerDied","Data":"e40e70efd570d4c70827ada2d6d3936048ae8b9c8b8699790adaef2174acc49b"} Dec 01 10:49:34 crc kubenswrapper[4909]: I1201 10:49:34.523552 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.964423735 podStartE2EDuration="12.523523868s" podCreationTimestamp="2025-12-01 10:49:22 +0000 UTC" firstStartedPulling="2025-12-01 10:49:23.897182865 +0000 UTC m=+1081.131653763" lastFinishedPulling="2025-12-01 10:49:33.456282998 +0000 UTC m=+1090.690753896" observedRunningTime="2025-12-01 10:49:34.356158083 +0000 UTC m=+1091.590628991" watchObservedRunningTime="2025-12-01 10:49:34.523523868 +0000 UTC m=+1091.757994766" Dec 01 10:49:34 crc kubenswrapper[4909]: I1201 
10:49:34.531084 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 10:49:34 crc kubenswrapper[4909]: I1201 10:49:34.531441 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1b2b1919-713a-48ec-9bed-34d7d2c8bfc8" containerName="kube-state-metrics" containerID="cri-o://89168ab99e02db4e1145039f649bc658aaaac026b1d21ce7661f786a43745ae2" gracePeriod=30 Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.065192 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.162010 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46dht\" (UniqueName: \"kubernetes.io/projected/1b2b1919-713a-48ec-9bed-34d7d2c8bfc8-kube-api-access-46dht\") pod \"1b2b1919-713a-48ec-9bed-34d7d2c8bfc8\" (UID: \"1b2b1919-713a-48ec-9bed-34d7d2c8bfc8\") " Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.170085 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b2b1919-713a-48ec-9bed-34d7d2c8bfc8-kube-api-access-46dht" (OuterVolumeSpecName: "kube-api-access-46dht") pod "1b2b1919-713a-48ec-9bed-34d7d2c8bfc8" (UID: "1b2b1919-713a-48ec-9bed-34d7d2c8bfc8"). InnerVolumeSpecName "kube-api-access-46dht". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.264296 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46dht\" (UniqueName: \"kubernetes.io/projected/1b2b1919-713a-48ec-9bed-34d7d2c8bfc8-kube-api-access-46dht\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.346043 4909 generic.go:334] "Generic (PLEG): container finished" podID="1b2b1919-713a-48ec-9bed-34d7d2c8bfc8" containerID="89168ab99e02db4e1145039f649bc658aaaac026b1d21ce7661f786a43745ae2" exitCode=2 Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.346097 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.346102 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1b2b1919-713a-48ec-9bed-34d7d2c8bfc8","Type":"ContainerDied","Data":"89168ab99e02db4e1145039f649bc658aaaac026b1d21ce7661f786a43745ae2"} Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.346179 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1b2b1919-713a-48ec-9bed-34d7d2c8bfc8","Type":"ContainerDied","Data":"1c469f52da707e5f2f40d50769f73f9af0eadb41d65d9e559d641b989feae197"} Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.346206 4909 scope.go:117] "RemoveContainer" containerID="89168ab99e02db4e1145039f649bc658aaaac026b1d21ce7661f786a43745ae2" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.367688 4909 scope.go:117] "RemoveContainer" containerID="89168ab99e02db4e1145039f649bc658aaaac026b1d21ce7661f786a43745ae2" Dec 01 10:49:35 crc kubenswrapper[4909]: E1201 10:49:35.368440 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"89168ab99e02db4e1145039f649bc658aaaac026b1d21ce7661f786a43745ae2\": container with ID starting with 89168ab99e02db4e1145039f649bc658aaaac026b1d21ce7661f786a43745ae2 not found: ID does not exist" containerID="89168ab99e02db4e1145039f649bc658aaaac026b1d21ce7661f786a43745ae2" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.368472 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89168ab99e02db4e1145039f649bc658aaaac026b1d21ce7661f786a43745ae2"} err="failed to get container status \"89168ab99e02db4e1145039f649bc658aaaac026b1d21ce7661f786a43745ae2\": rpc error: code = NotFound desc = could not find container \"89168ab99e02db4e1145039f649bc658aaaac026b1d21ce7661f786a43745ae2\": container with ID starting with 89168ab99e02db4e1145039f649bc658aaaac026b1d21ce7661f786a43745ae2 not found: ID does not exist" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.380591 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.414661 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.443637 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 10:49:35 crc kubenswrapper[4909]: E1201 10:49:35.444235 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2b1919-713a-48ec-9bed-34d7d2c8bfc8" containerName="kube-state-metrics" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.444258 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2b1919-713a-48ec-9bed-34d7d2c8bfc8" containerName="kube-state-metrics" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.444462 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b2b1919-713a-48ec-9bed-34d7d2c8bfc8" containerName="kube-state-metrics" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 
10:49:35.445234 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.453981 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.454239 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.462667 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.571329 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b55ffd9-122b-4a3b-b592-99f375e261fe-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7b55ffd9-122b-4a3b-b592-99f375e261fe\") " pod="openstack/kube-state-metrics-0" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.571382 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pxxj\" (UniqueName: \"kubernetes.io/projected/7b55ffd9-122b-4a3b-b592-99f375e261fe-kube-api-access-8pxxj\") pod \"kube-state-metrics-0\" (UID: \"7b55ffd9-122b-4a3b-b592-99f375e261fe\") " pod="openstack/kube-state-metrics-0" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.571421 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b55ffd9-122b-4a3b-b592-99f375e261fe-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7b55ffd9-122b-4a3b-b592-99f375e261fe\") " pod="openstack/kube-state-metrics-0" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.571780 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7b55ffd9-122b-4a3b-b592-99f375e261fe-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7b55ffd9-122b-4a3b-b592-99f375e261fe\") " pod="openstack/kube-state-metrics-0" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.673633 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7b55ffd9-122b-4a3b-b592-99f375e261fe-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7b55ffd9-122b-4a3b-b592-99f375e261fe\") " pod="openstack/kube-state-metrics-0" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.673753 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b55ffd9-122b-4a3b-b592-99f375e261fe-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7b55ffd9-122b-4a3b-b592-99f375e261fe\") " pod="openstack/kube-state-metrics-0" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.673792 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pxxj\" (UniqueName: \"kubernetes.io/projected/7b55ffd9-122b-4a3b-b592-99f375e261fe-kube-api-access-8pxxj\") pod \"kube-state-metrics-0\" (UID: \"7b55ffd9-122b-4a3b-b592-99f375e261fe\") " pod="openstack/kube-state-metrics-0" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.673829 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b55ffd9-122b-4a3b-b592-99f375e261fe-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7b55ffd9-122b-4a3b-b592-99f375e261fe\") " pod="openstack/kube-state-metrics-0" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.681009 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7b55ffd9-122b-4a3b-b592-99f375e261fe-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7b55ffd9-122b-4a3b-b592-99f375e261fe\") " pod="openstack/kube-state-metrics-0" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.681168 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b55ffd9-122b-4a3b-b592-99f375e261fe-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7b55ffd9-122b-4a3b-b592-99f375e261fe\") " pod="openstack/kube-state-metrics-0" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.681302 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b55ffd9-122b-4a3b-b592-99f375e261fe-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7b55ffd9-122b-4a3b-b592-99f375e261fe\") " pod="openstack/kube-state-metrics-0" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.701668 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pxxj\" (UniqueName: \"kubernetes.io/projected/7b55ffd9-122b-4a3b-b592-99f375e261fe-kube-api-access-8pxxj\") pod \"kube-state-metrics-0\" (UID: \"7b55ffd9-122b-4a3b-b592-99f375e261fe\") " pod="openstack/kube-state-metrics-0" Dec 01 10:49:35 crc kubenswrapper[4909]: I1201 10:49:35.764306 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.194230 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.194290 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.270742 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 10:49:36 crc kubenswrapper[4909]: W1201 10:49:36.284354 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b55ffd9_122b_4a3b_b592_99f375e261fe.slice/crio-3c126d3fdd3f583176aa3f495c3abf7ad000e9abbb73c4e41084500f52889484 WatchSource:0}: Error finding container 3c126d3fdd3f583176aa3f495c3abf7ad000e9abbb73c4e41084500f52889484: Status 404 returned error can't find the container with id 3c126d3fdd3f583176aa3f495c3abf7ad000e9abbb73c4e41084500f52889484 Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.365988 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7b55ffd9-122b-4a3b-b592-99f375e261fe","Type":"ContainerStarted","Data":"3c126d3fdd3f583176aa3f495c3abf7ad000e9abbb73c4e41084500f52889484"} Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.370604 4909 generic.go:334] "Generic (PLEG): container finished" podID="fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" 
containerID="32a8aa49f1ad593334f062df2f7f8aad1cacaa47733a54774208e364c57632c2" exitCode=0 Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.370649 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac","Type":"ContainerDied","Data":"32a8aa49f1ad593334f062df2f7f8aad1cacaa47733a54774208e364c57632c2"} Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.528630 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.692951 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-sg-core-conf-yaml\") pod \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.693041 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk9ws\" (UniqueName: \"kubernetes.io/projected/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-kube-api-access-zk9ws\") pod \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.693094 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-scripts\") pod \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.693214 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-log-httpd\") pod \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " Dec 01 10:49:36 crc 
kubenswrapper[4909]: I1201 10:49:36.693277 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-config-data\") pod \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.693319 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-run-httpd\") pod \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.693452 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-combined-ca-bundle\") pod \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\" (UID: \"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac\") " Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.697440 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" (UID: "fe7f1b16-a8d7-4d13-a699-62e4d675f6ac"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.698387 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" (UID: "fe7f1b16-a8d7-4d13-a699-62e4d675f6ac"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.705013 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-kube-api-access-zk9ws" (OuterVolumeSpecName: "kube-api-access-zk9ws") pod "fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" (UID: "fe7f1b16-a8d7-4d13-a699-62e4d675f6ac"). InnerVolumeSpecName "kube-api-access-zk9ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.724278 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-scripts" (OuterVolumeSpecName: "scripts") pod "fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" (UID: "fe7f1b16-a8d7-4d13-a699-62e4d675f6ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.738048 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" (UID: "fe7f1b16-a8d7-4d13-a699-62e4d675f6ac"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.795143 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.795180 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk9ws\" (UniqueName: \"kubernetes.io/projected/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-kube-api-access-zk9ws\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.795189 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.795199 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.795207 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.817794 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" (UID: "fe7f1b16-a8d7-4d13-a699-62e4d675f6ac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.856956 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-config-data" (OuterVolumeSpecName: "config-data") pod "fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" (UID: "fe7f1b16-a8d7-4d13-a699-62e4d675f6ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.897287 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:36 crc kubenswrapper[4909]: I1201 10:49:36.897327 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.268281 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b2b1919-713a-48ec-9bed-34d7d2c8bfc8" path="/var/lib/kubelet/pods/1b2b1919-713a-48ec-9bed-34d7d2c8bfc8/volumes" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.387568 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7b55ffd9-122b-4a3b-b592-99f375e261fe","Type":"ContainerStarted","Data":"233f6df1ab935e2cc3c13c652c8b00918a959860920d3ea052da64dc6ce0cc01"} Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.387957 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.397806 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fe7f1b16-a8d7-4d13-a699-62e4d675f6ac","Type":"ContainerDied","Data":"fed63f7609ab25c94a2d1f059d9919cdf58c1c1ee799fa908257a2af1660bc6d"} Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.397866 4909 scope.go:117] "RemoveContainer" containerID="014d158da905d1fc5df65a2ea337e84fd6edfd9581eccce41205224a9c4bd394" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.398100 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.416297 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.048755706 podStartE2EDuration="2.41627891s" podCreationTimestamp="2025-12-01 10:49:35 +0000 UTC" firstStartedPulling="2025-12-01 10:49:36.303693377 +0000 UTC m=+1093.538164275" lastFinishedPulling="2025-12-01 10:49:36.671216581 +0000 UTC m=+1093.905687479" observedRunningTime="2025-12-01 10:49:37.402975686 +0000 UTC m=+1094.637446614" watchObservedRunningTime="2025-12-01 10:49:37.41627891 +0000 UTC m=+1094.650749808" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.427937 4909 scope.go:117] "RemoveContainer" containerID="fe93d81b91ca9d9e62e11270385a1c14f38da2c4b38973c08dabea2b8c463377" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.440805 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.448084 4909 scope.go:117] "RemoveContainer" containerID="32a8aa49f1ad593334f062df2f7f8aad1cacaa47733a54774208e364c57632c2" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.453950 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.468822 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:49:37 crc kubenswrapper[4909]: E1201 10:49:37.469374 4909 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" containerName="ceilometer-notification-agent" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.469393 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" containerName="ceilometer-notification-agent" Dec 01 10:49:37 crc kubenswrapper[4909]: E1201 10:49:37.469424 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" containerName="proxy-httpd" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.469432 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" containerName="proxy-httpd" Dec 01 10:49:37 crc kubenswrapper[4909]: E1201 10:49:37.469451 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" containerName="ceilometer-central-agent" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.469460 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" containerName="ceilometer-central-agent" Dec 01 10:49:37 crc kubenswrapper[4909]: E1201 10:49:37.469483 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" containerName="sg-core" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.469490 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" containerName="sg-core" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.469712 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" containerName="ceilometer-central-agent" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.469733 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" containerName="ceilometer-notification-agent" Dec 01 10:49:37 crc 
kubenswrapper[4909]: I1201 10:49:37.469746 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" containerName="sg-core" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.469759 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" containerName="proxy-httpd" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.471904 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.474439 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.474736 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.474894 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.475044 4909 scope.go:117] "RemoveContainer" containerID="e40e70efd570d4c70827ada2d6d3936048ae8b9c8b8699790adaef2174acc49b" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.480436 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.618928 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.618988 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.619071 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-scripts\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.619143 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d40688c0-30f7-4659-aa2d-cb314ab383ab-log-httpd\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.619182 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-config-data\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.619232 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wqv2\" (UniqueName: \"kubernetes.io/projected/d40688c0-30f7-4659-aa2d-cb314ab383ab-kube-api-access-9wqv2\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.619255 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.619296 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d40688c0-30f7-4659-aa2d-cb314ab383ab-run-httpd\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.721351 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.721399 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.721435 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-scripts\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.721470 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d40688c0-30f7-4659-aa2d-cb314ab383ab-log-httpd\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.721497 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-config-data\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.721534 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wqv2\" (UniqueName: \"kubernetes.io/projected/d40688c0-30f7-4659-aa2d-cb314ab383ab-kube-api-access-9wqv2\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.721553 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.721580 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d40688c0-30f7-4659-aa2d-cb314ab383ab-run-httpd\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.722127 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d40688c0-30f7-4659-aa2d-cb314ab383ab-run-httpd\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.722573 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d40688c0-30f7-4659-aa2d-cb314ab383ab-log-httpd\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc 
kubenswrapper[4909]: I1201 10:49:37.728045 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.728143 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.729067 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-scripts\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.731136 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-config-data\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.742258 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.760860 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wqv2\" (UniqueName: \"kubernetes.io/projected/d40688c0-30f7-4659-aa2d-cb314ab383ab-kube-api-access-9wqv2\") pod \"ceilometer-0\" 
(UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.808629 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:49:37 crc kubenswrapper[4909]: I1201 10:49:37.966695 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:49:38 crc kubenswrapper[4909]: I1201 10:49:38.260701 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:49:38 crc kubenswrapper[4909]: W1201 10:49:38.282276 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd40688c0_30f7_4659_aa2d_cb314ab383ab.slice/crio-e954eb5e6e77bd9b787e2cc13b9435a2ec6cdab99534258eedf3040aa6afd87d WatchSource:0}: Error finding container e954eb5e6e77bd9b787e2cc13b9435a2ec6cdab99534258eedf3040aa6afd87d: Status 404 returned error can't find the container with id e954eb5e6e77bd9b787e2cc13b9435a2ec6cdab99534258eedf3040aa6afd87d Dec 01 10:49:38 crc kubenswrapper[4909]: I1201 10:49:38.417037 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d40688c0-30f7-4659-aa2d-cb314ab383ab","Type":"ContainerStarted","Data":"e954eb5e6e77bd9b787e2cc13b9435a2ec6cdab99534258eedf3040aa6afd87d"} Dec 01 10:49:38 crc kubenswrapper[4909]: I1201 10:49:38.998250 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-8nwnt"] Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.000044 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-8nwnt" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.014165 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-8nwnt"] Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.099547 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-85ccm"] Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.101041 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-85ccm" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.118674 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-85ccm"] Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.152339 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d8848ea-48fe-4b9c-9cd2-6935a6a28717-operator-scripts\") pod \"nova-api-db-create-8nwnt\" (UID: \"3d8848ea-48fe-4b9c-9cd2-6935a6a28717\") " pod="openstack/nova-api-db-create-8nwnt" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.152400 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjbbc\" (UniqueName: \"kubernetes.io/projected/3d8848ea-48fe-4b9c-9cd2-6935a6a28717-kube-api-access-bjbbc\") pod \"nova-api-db-create-8nwnt\" (UID: \"3d8848ea-48fe-4b9c-9cd2-6935a6a28717\") " pod="openstack/nova-api-db-create-8nwnt" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.226665 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-c69c-account-create-update-w655r"] Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.228464 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c69c-account-create-update-w655r" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.245859 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.248418 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c69c-account-create-update-w655r"] Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.254308 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d8848ea-48fe-4b9c-9cd2-6935a6a28717-operator-scripts\") pod \"nova-api-db-create-8nwnt\" (UID: \"3d8848ea-48fe-4b9c-9cd2-6935a6a28717\") " pod="openstack/nova-api-db-create-8nwnt" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.254364 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjbbc\" (UniqueName: \"kubernetes.io/projected/3d8848ea-48fe-4b9c-9cd2-6935a6a28717-kube-api-access-bjbbc\") pod \"nova-api-db-create-8nwnt\" (UID: \"3d8848ea-48fe-4b9c-9cd2-6935a6a28717\") " pod="openstack/nova-api-db-create-8nwnt" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.254411 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ddd34b0-2faa-4d36-a8f6-485e2d05a71a-operator-scripts\") pod \"nova-cell0-db-create-85ccm\" (UID: \"1ddd34b0-2faa-4d36-a8f6-485e2d05a71a\") " pod="openstack/nova-cell0-db-create-85ccm" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.254444 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp5dm\" (UniqueName: \"kubernetes.io/projected/1ddd34b0-2faa-4d36-a8f6-485e2d05a71a-kube-api-access-pp5dm\") pod \"nova-cell0-db-create-85ccm\" (UID: \"1ddd34b0-2faa-4d36-a8f6-485e2d05a71a\") " 
pod="openstack/nova-cell0-db-create-85ccm" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.255436 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d8848ea-48fe-4b9c-9cd2-6935a6a28717-operator-scripts\") pod \"nova-api-db-create-8nwnt\" (UID: \"3d8848ea-48fe-4b9c-9cd2-6935a6a28717\") " pod="openstack/nova-api-db-create-8nwnt" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.299913 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjbbc\" (UniqueName: \"kubernetes.io/projected/3d8848ea-48fe-4b9c-9cd2-6935a6a28717-kube-api-access-bjbbc\") pod \"nova-api-db-create-8nwnt\" (UID: \"3d8848ea-48fe-4b9c-9cd2-6935a6a28717\") " pod="openstack/nova-api-db-create-8nwnt" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.308321 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe7f1b16-a8d7-4d13-a699-62e4d675f6ac" path="/var/lib/kubelet/pods/fe7f1b16-a8d7-4d13-a699-62e4d675f6ac/volumes" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.312153 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-nhslm"] Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.313993 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nhslm" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.325079 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-8nwnt" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.336919 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nhslm"] Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.356480 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp5dm\" (UniqueName: \"kubernetes.io/projected/1ddd34b0-2faa-4d36-a8f6-485e2d05a71a-kube-api-access-pp5dm\") pod \"nova-cell0-db-create-85ccm\" (UID: \"1ddd34b0-2faa-4d36-a8f6-485e2d05a71a\") " pod="openstack/nova-cell0-db-create-85ccm" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.356560 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9msmz\" (UniqueName: \"kubernetes.io/projected/39a13de5-4b8b-4ec1-b6ab-de297ac31eea-kube-api-access-9msmz\") pod \"nova-api-c69c-account-create-update-w655r\" (UID: \"39a13de5-4b8b-4ec1-b6ab-de297ac31eea\") " pod="openstack/nova-api-c69c-account-create-update-w655r" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.356660 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39a13de5-4b8b-4ec1-b6ab-de297ac31eea-operator-scripts\") pod \"nova-api-c69c-account-create-update-w655r\" (UID: \"39a13de5-4b8b-4ec1-b6ab-de297ac31eea\") " pod="openstack/nova-api-c69c-account-create-update-w655r" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.356701 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ddd34b0-2faa-4d36-a8f6-485e2d05a71a-operator-scripts\") pod \"nova-cell0-db-create-85ccm\" (UID: \"1ddd34b0-2faa-4d36-a8f6-485e2d05a71a\") " pod="openstack/nova-cell0-db-create-85ccm" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.357390 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ddd34b0-2faa-4d36-a8f6-485e2d05a71a-operator-scripts\") pod \"nova-cell0-db-create-85ccm\" (UID: \"1ddd34b0-2faa-4d36-a8f6-485e2d05a71a\") " pod="openstack/nova-cell0-db-create-85ccm" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.386593 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp5dm\" (UniqueName: \"kubernetes.io/projected/1ddd34b0-2faa-4d36-a8f6-485e2d05a71a-kube-api-access-pp5dm\") pod \"nova-cell0-db-create-85ccm\" (UID: \"1ddd34b0-2faa-4d36-a8f6-485e2d05a71a\") " pod="openstack/nova-cell0-db-create-85ccm" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.422341 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-f543-account-create-update-qnsrz"] Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.424791 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f543-account-create-update-qnsrz" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.429693 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.444451 4909 generic.go:334] "Generic (PLEG): container finished" podID="56afb5f9-412e-4590-8ac7-9f86d79dd2e6" containerID="ae29488b26cf736be146aa4a9c510b1d41815613b09aa6334ce501408a06929a" exitCode=137 Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.444500 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"56afb5f9-412e-4590-8ac7-9f86d79dd2e6","Type":"ContainerDied","Data":"ae29488b26cf736be146aa4a9c510b1d41815613b09aa6334ce501408a06929a"} Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.449035 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f543-account-create-update-qnsrz"] Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 
10:49:39.460958 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a15e2fbe-81d5-4d34-a5a4-e920de0c607e-operator-scripts\") pod \"nova-cell1-db-create-nhslm\" (UID: \"a15e2fbe-81d5-4d34-a5a4-e920de0c607e\") " pod="openstack/nova-cell1-db-create-nhslm" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.461067 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9msmz\" (UniqueName: \"kubernetes.io/projected/39a13de5-4b8b-4ec1-b6ab-de297ac31eea-kube-api-access-9msmz\") pod \"nova-api-c69c-account-create-update-w655r\" (UID: \"39a13de5-4b8b-4ec1-b6ab-de297ac31eea\") " pod="openstack/nova-api-c69c-account-create-update-w655r" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.461142 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvqlq\" (UniqueName: \"kubernetes.io/projected/a15e2fbe-81d5-4d34-a5a4-e920de0c607e-kube-api-access-tvqlq\") pod \"nova-cell1-db-create-nhslm\" (UID: \"a15e2fbe-81d5-4d34-a5a4-e920de0c607e\") " pod="openstack/nova-cell1-db-create-nhslm" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.461248 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39a13de5-4b8b-4ec1-b6ab-de297ac31eea-operator-scripts\") pod \"nova-api-c69c-account-create-update-w655r\" (UID: \"39a13de5-4b8b-4ec1-b6ab-de297ac31eea\") " pod="openstack/nova-api-c69c-account-create-update-w655r" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.485720 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39a13de5-4b8b-4ec1-b6ab-de297ac31eea-operator-scripts\") pod \"nova-api-c69c-account-create-update-w655r\" (UID: \"39a13de5-4b8b-4ec1-b6ab-de297ac31eea\") " 
pod="openstack/nova-api-c69c-account-create-update-w655r" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.494806 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9msmz\" (UniqueName: \"kubernetes.io/projected/39a13de5-4b8b-4ec1-b6ab-de297ac31eea-kube-api-access-9msmz\") pod \"nova-api-c69c-account-create-update-w655r\" (UID: \"39a13de5-4b8b-4ec1-b6ab-de297ac31eea\") " pod="openstack/nova-api-c69c-account-create-update-w655r" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.563385 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a15e2fbe-81d5-4d34-a5a4-e920de0c607e-operator-scripts\") pod \"nova-cell1-db-create-nhslm\" (UID: \"a15e2fbe-81d5-4d34-a5a4-e920de0c607e\") " pod="openstack/nova-cell1-db-create-nhslm" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.563531 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5zxw\" (UniqueName: \"kubernetes.io/projected/0c829e27-1fd3-4d20-a4c8-080580ea341d-kube-api-access-t5zxw\") pod \"nova-cell0-f543-account-create-update-qnsrz\" (UID: \"0c829e27-1fd3-4d20-a4c8-080580ea341d\") " pod="openstack/nova-cell0-f543-account-create-update-qnsrz" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.563571 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c829e27-1fd3-4d20-a4c8-080580ea341d-operator-scripts\") pod \"nova-cell0-f543-account-create-update-qnsrz\" (UID: \"0c829e27-1fd3-4d20-a4c8-080580ea341d\") " pod="openstack/nova-cell0-f543-account-create-update-qnsrz" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.563598 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvqlq\" (UniqueName: 
\"kubernetes.io/projected/a15e2fbe-81d5-4d34-a5a4-e920de0c607e-kube-api-access-tvqlq\") pod \"nova-cell1-db-create-nhslm\" (UID: \"a15e2fbe-81d5-4d34-a5a4-e920de0c607e\") " pod="openstack/nova-cell1-db-create-nhslm" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.564998 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a15e2fbe-81d5-4d34-a5a4-e920de0c607e-operator-scripts\") pod \"nova-cell1-db-create-nhslm\" (UID: \"a15e2fbe-81d5-4d34-a5a4-e920de0c607e\") " pod="openstack/nova-cell1-db-create-nhslm" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.614773 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvqlq\" (UniqueName: \"kubernetes.io/projected/a15e2fbe-81d5-4d34-a5a4-e920de0c607e-kube-api-access-tvqlq\") pod \"nova-cell1-db-create-nhslm\" (UID: \"a15e2fbe-81d5-4d34-a5a4-e920de0c607e\") " pod="openstack/nova-cell1-db-create-nhslm" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.620115 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-303c-account-create-update-8ff88"] Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.621768 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-303c-account-create-update-8ff88" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.625367 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.644952 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-303c-account-create-update-8ff88"] Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.646734 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-85ccm" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.678605 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.680122 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5zxw\" (UniqueName: \"kubernetes.io/projected/0c829e27-1fd3-4d20-a4c8-080580ea341d-kube-api-access-t5zxw\") pod \"nova-cell0-f543-account-create-update-qnsrz\" (UID: \"0c829e27-1fd3-4d20-a4c8-080580ea341d\") " pod="openstack/nova-cell0-f543-account-create-update-qnsrz" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.680155 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c829e27-1fd3-4d20-a4c8-080580ea341d-operator-scripts\") pod \"nova-cell0-f543-account-create-update-qnsrz\" (UID: \"0c829e27-1fd3-4d20-a4c8-080580ea341d\") " pod="openstack/nova-cell0-f543-account-create-update-qnsrz" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.680788 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c829e27-1fd3-4d20-a4c8-080580ea341d-operator-scripts\") pod \"nova-cell0-f543-account-create-update-qnsrz\" (UID: \"0c829e27-1fd3-4d20-a4c8-080580ea341d\") " pod="openstack/nova-cell0-f543-account-create-update-qnsrz" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.720812 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5zxw\" (UniqueName: \"kubernetes.io/projected/0c829e27-1fd3-4d20-a4c8-080580ea341d-kube-api-access-t5zxw\") pod \"nova-cell0-f543-account-create-update-qnsrz\" (UID: \"0c829e27-1fd3-4d20-a4c8-080580ea341d\") " pod="openstack/nova-cell0-f543-account-create-update-qnsrz" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 
10:49:39.723031 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c69c-account-create-update-w655r" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.759670 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nhslm" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.794338 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f543-account-create-update-qnsrz" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.796696 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-etc-machine-id\") pod \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.796788 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-config-data-custom\") pod \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.796860 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-logs\") pod \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.797020 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-config-data\") pod \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.797058 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-scripts\") pod \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.797074 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-combined-ca-bundle\") pod \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.797108 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqzfv\" (UniqueName: \"kubernetes.io/projected/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-kube-api-access-fqzfv\") pod \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\" (UID: \"56afb5f9-412e-4590-8ac7-9f86d79dd2e6\") " Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.797343 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17519694-f981-49fe-8579-c037e0afd59a-operator-scripts\") pod \"nova-cell1-303c-account-create-update-8ff88\" (UID: \"17519694-f981-49fe-8579-c037e0afd59a\") " pod="openstack/nova-cell1-303c-account-create-update-8ff88" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.797413 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5fpj\" (UniqueName: \"kubernetes.io/projected/17519694-f981-49fe-8579-c037e0afd59a-kube-api-access-s5fpj\") pod \"nova-cell1-303c-account-create-update-8ff88\" (UID: \"17519694-f981-49fe-8579-c037e0afd59a\") " pod="openstack/nova-cell1-303c-account-create-update-8ff88" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.797571 4909 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "56afb5f9-412e-4590-8ac7-9f86d79dd2e6" (UID: "56afb5f9-412e-4590-8ac7-9f86d79dd2e6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.798429 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-logs" (OuterVolumeSpecName: "logs") pod "56afb5f9-412e-4590-8ac7-9f86d79dd2e6" (UID: "56afb5f9-412e-4590-8ac7-9f86d79dd2e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.811533 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "56afb5f9-412e-4590-8ac7-9f86d79dd2e6" (UID: "56afb5f9-412e-4590-8ac7-9f86d79dd2e6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.815052 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-scripts" (OuterVolumeSpecName: "scripts") pod "56afb5f9-412e-4590-8ac7-9f86d79dd2e6" (UID: "56afb5f9-412e-4590-8ac7-9f86d79dd2e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.820147 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-kube-api-access-fqzfv" (OuterVolumeSpecName: "kube-api-access-fqzfv") pod "56afb5f9-412e-4590-8ac7-9f86d79dd2e6" (UID: "56afb5f9-412e-4590-8ac7-9f86d79dd2e6"). 
InnerVolumeSpecName "kube-api-access-fqzfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.871583 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56afb5f9-412e-4590-8ac7-9f86d79dd2e6" (UID: "56afb5f9-412e-4590-8ac7-9f86d79dd2e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.875193 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-config-data" (OuterVolumeSpecName: "config-data") pod "56afb5f9-412e-4590-8ac7-9f86d79dd2e6" (UID: "56afb5f9-412e-4590-8ac7-9f86d79dd2e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.899845 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17519694-f981-49fe-8579-c037e0afd59a-operator-scripts\") pod \"nova-cell1-303c-account-create-update-8ff88\" (UID: \"17519694-f981-49fe-8579-c037e0afd59a\") " pod="openstack/nova-cell1-303c-account-create-update-8ff88" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.900020 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5fpj\" (UniqueName: \"kubernetes.io/projected/17519694-f981-49fe-8579-c037e0afd59a-kube-api-access-s5fpj\") pod \"nova-cell1-303c-account-create-update-8ff88\" (UID: \"17519694-f981-49fe-8579-c037e0afd59a\") " pod="openstack/nova-cell1-303c-account-create-update-8ff88" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.900140 4909 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.900155 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.900167 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.900179 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.900208 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.900219 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.900230 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqzfv\" (UniqueName: \"kubernetes.io/projected/56afb5f9-412e-4590-8ac7-9f86d79dd2e6-kube-api-access-fqzfv\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.900910 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17519694-f981-49fe-8579-c037e0afd59a-operator-scripts\") pod 
\"nova-cell1-303c-account-create-update-8ff88\" (UID: \"17519694-f981-49fe-8579-c037e0afd59a\") " pod="openstack/nova-cell1-303c-account-create-update-8ff88" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.927663 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5fpj\" (UniqueName: \"kubernetes.io/projected/17519694-f981-49fe-8579-c037e0afd59a-kube-api-access-s5fpj\") pod \"nova-cell1-303c-account-create-update-8ff88\" (UID: \"17519694-f981-49fe-8579-c037e0afd59a\") " pod="openstack/nova-cell1-303c-account-create-update-8ff88" Dec 01 10:49:39 crc kubenswrapper[4909]: I1201 10:49:39.949922 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-303c-account-create-update-8ff88" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.031406 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-8nwnt"] Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.317560 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c69c-account-create-update-w655r"] Dec 01 10:49:40 crc kubenswrapper[4909]: W1201 10:49:40.353245 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ddd34b0_2faa_4d36_a8f6_485e2d05a71a.slice/crio-2a9f56c227f3e0381bbdd024a92b00a799daf7c179bd7fc9f628a1e69a86b255 WatchSource:0}: Error finding container 2a9f56c227f3e0381bbdd024a92b00a799daf7c179bd7fc9f628a1e69a86b255: Status 404 returned error can't find the container with id 2a9f56c227f3e0381bbdd024a92b00a799daf7c179bd7fc9f628a1e69a86b255 Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.353756 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-85ccm"] Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.475262 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d40688c0-30f7-4659-aa2d-cb314ab383ab","Type":"ContainerStarted","Data":"650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6"} Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.485820 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c69c-account-create-update-w655r" event={"ID":"39a13de5-4b8b-4ec1-b6ab-de297ac31eea","Type":"ContainerStarted","Data":"3e209541314a9527e07892001cef24ccc10fef3468073a29572c3cf48f7f15ef"} Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.493058 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8nwnt" event={"ID":"3d8848ea-48fe-4b9c-9cd2-6935a6a28717","Type":"ContainerStarted","Data":"a13f30b62ac6d435ce9bb549be5523a9ab3299a52e25535075d1d94aa11bac16"} Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.494802 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-85ccm" event={"ID":"1ddd34b0-2faa-4d36-a8f6-485e2d05a71a","Type":"ContainerStarted","Data":"2a9f56c227f3e0381bbdd024a92b00a799daf7c179bd7fc9f628a1e69a86b255"} Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.532182 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"56afb5f9-412e-4590-8ac7-9f86d79dd2e6","Type":"ContainerDied","Data":"fe15492aa43bd376aa7c3b161217cd706236e86a02e583ec72f54016b95db5ce"} Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.532241 4909 scope.go:117] "RemoveContainer" containerID="ae29488b26cf736be146aa4a9c510b1d41815613b09aa6334ce501408a06929a" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.532475 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.541234 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-8nwnt" podStartSLOduration=2.541209007 podStartE2EDuration="2.541209007s" podCreationTimestamp="2025-12-01 10:49:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:49:40.533765225 +0000 UTC m=+1097.768236123" watchObservedRunningTime="2025-12-01 10:49:40.541209007 +0000 UTC m=+1097.775679925" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.643898 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.666898 4909 scope.go:117] "RemoveContainer" containerID="e7d929b88a2332a891bda3385d79c8a4ebd973860453e5ca1b7ff73aa97a9b6d" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.717862 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.761099 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 01 10:49:40 crc kubenswrapper[4909]: E1201 10:49:40.761546 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56afb5f9-412e-4590-8ac7-9f86d79dd2e6" containerName="cinder-api-log" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.761560 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="56afb5f9-412e-4590-8ac7-9f86d79dd2e6" containerName="cinder-api-log" Dec 01 10:49:40 crc kubenswrapper[4909]: E1201 10:49:40.761611 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56afb5f9-412e-4590-8ac7-9f86d79dd2e6" containerName="cinder-api" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.761619 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="56afb5f9-412e-4590-8ac7-9f86d79dd2e6" 
containerName="cinder-api" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.761845 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="56afb5f9-412e-4590-8ac7-9f86d79dd2e6" containerName="cinder-api" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.761893 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="56afb5f9-412e-4590-8ac7-9f86d79dd2e6" containerName="cinder-api-log" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.763071 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.769982 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.770508 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.780169 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f543-account-create-update-qnsrz"] Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.786219 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.794943 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nhslm"] Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.805060 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.830014 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-303c-account-create-update-8ff88"] Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.938867 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-scripts\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.939289 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-logs\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.939321 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.939351 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-config-data-custom\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.939573 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.939638 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.939741 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q6hc\" (UniqueName: \"kubernetes.io/projected/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-kube-api-access-7q6hc\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.939806 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:40 crc kubenswrapper[4909]: I1201 10:49:40.939923 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-config-data\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.041450 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.043381 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.043452 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q6hc\" (UniqueName: \"kubernetes.io/projected/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-kube-api-access-7q6hc\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.043511 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.043548 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-config-data\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.043674 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-scripts\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.043744 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-logs\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.043767 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-public-tls-certs\") pod \"cinder-api-0\" (UID: 
\"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.043789 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-config-data-custom\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.045442 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-logs\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.045865 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.050626 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-config-data-custom\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.052106 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.055019 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.055599 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.056465 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-config-data\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.058729 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-scripts\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.066117 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q6hc\" (UniqueName: \"kubernetes.io/projected/2c19b6f5-fcf4-4655-bdaa-10257b92f6dd-kube-api-access-7q6hc\") pod \"cinder-api-0\" (UID: \"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd\") " pod="openstack/cinder-api-0" Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.118722 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.275789 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56afb5f9-412e-4590-8ac7-9f86d79dd2e6" path="/var/lib/kubelet/pods/56afb5f9-412e-4590-8ac7-9f86d79dd2e6/volumes" Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.550822 4909 generic.go:334] "Generic (PLEG): container finished" podID="a15e2fbe-81d5-4d34-a5a4-e920de0c607e" containerID="93ed8cc7aeae497a62c813e5c64ec92413b7297909aadf7dde1d6e0ab7250fbe" exitCode=0 Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.551079 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nhslm" event={"ID":"a15e2fbe-81d5-4d34-a5a4-e920de0c607e","Type":"ContainerDied","Data":"93ed8cc7aeae497a62c813e5c64ec92413b7297909aadf7dde1d6e0ab7250fbe"} Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.552105 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nhslm" event={"ID":"a15e2fbe-81d5-4d34-a5a4-e920de0c607e","Type":"ContainerStarted","Data":"76639479ac33403aba6c688276bf92809165567a427fc7ede80cd360e50b1b27"} Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.554009 4909 generic.go:334] "Generic (PLEG): container finished" podID="39a13de5-4b8b-4ec1-b6ab-de297ac31eea" containerID="5a2c50bac4437dcfa63d5d55e1fc7633aee69a8e30b09a902cbda218e65ee808" exitCode=0 Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.554083 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c69c-account-create-update-w655r" event={"ID":"39a13de5-4b8b-4ec1-b6ab-de297ac31eea","Type":"ContainerDied","Data":"5a2c50bac4437dcfa63d5d55e1fc7633aee69a8e30b09a902cbda218e65ee808"} Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.555525 4909 generic.go:334] "Generic (PLEG): container finished" podID="3d8848ea-48fe-4b9c-9cd2-6935a6a28717" 
containerID="1b1c081ad011b830f52069e6d7413ec089c27f035353d85d6fab421846a252ee" exitCode=0 Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.555635 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8nwnt" event={"ID":"3d8848ea-48fe-4b9c-9cd2-6935a6a28717","Type":"ContainerDied","Data":"1b1c081ad011b830f52069e6d7413ec089c27f035353d85d6fab421846a252ee"} Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.560234 4909 generic.go:334] "Generic (PLEG): container finished" podID="0c829e27-1fd3-4d20-a4c8-080580ea341d" containerID="9d36db06fa89f96e9752917fb917370a1f9fa4d5705c48b69a01a28e1838ff13" exitCode=0 Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.560397 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f543-account-create-update-qnsrz" event={"ID":"0c829e27-1fd3-4d20-a4c8-080580ea341d","Type":"ContainerDied","Data":"9d36db06fa89f96e9752917fb917370a1f9fa4d5705c48b69a01a28e1838ff13"} Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.560490 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f543-account-create-update-qnsrz" event={"ID":"0c829e27-1fd3-4d20-a4c8-080580ea341d","Type":"ContainerStarted","Data":"a52c043d5a2359ac427f313b0dd44b9178eb0d1287febb8c7744e0b767efffc4"} Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.567497 4909 generic.go:334] "Generic (PLEG): container finished" podID="1ddd34b0-2faa-4d36-a8f6-485e2d05a71a" containerID="fd8f0f37f38583ab1dd43125ff22588bf9286608d390787c80430153ee02f484" exitCode=0 Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.567634 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-85ccm" event={"ID":"1ddd34b0-2faa-4d36-a8f6-485e2d05a71a","Type":"ContainerDied","Data":"fd8f0f37f38583ab1dd43125ff22588bf9286608d390787c80430153ee02f484"} Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.580772 4909 generic.go:334] "Generic (PLEG): container finished" 
podID="17519694-f981-49fe-8579-c037e0afd59a" containerID="fa83846e3bbdcf2223b2a1232cf6495e7cb0f22e17bac29575877582115d6dfd" exitCode=0 Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.580908 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-303c-account-create-update-8ff88" event={"ID":"17519694-f981-49fe-8579-c037e0afd59a","Type":"ContainerDied","Data":"fa83846e3bbdcf2223b2a1232cf6495e7cb0f22e17bac29575877582115d6dfd"} Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.580942 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-303c-account-create-update-8ff88" event={"ID":"17519694-f981-49fe-8579-c037e0afd59a","Type":"ContainerStarted","Data":"65c881a5d031a6e13d4fbc388e4147c08975a3ca68284d57e2513ee0dc941c51"} Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.589466 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d40688c0-30f7-4659-aa2d-cb314ab383ab","Type":"ContainerStarted","Data":"6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b"} Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.589541 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d40688c0-30f7-4659-aa2d-cb314ab383ab","Type":"ContainerStarted","Data":"5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5"} Dec 01 10:49:41 crc kubenswrapper[4909]: I1201 10:49:41.641560 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 10:49:42 crc kubenswrapper[4909]: I1201 10:49:42.604493 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd","Type":"ContainerStarted","Data":"4c81b0a3e41d030deae3fe64767740761afdb6bfad3bd7b30732dc3b018654de"} Dec 01 10:49:42 crc kubenswrapper[4909]: I1201 10:49:42.604809 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd","Type":"ContainerStarted","Data":"4d49c5a4503604593a5f579d97f7981095719a256183b24b2f13fc4a56ef094d"} Dec 01 10:49:42 crc kubenswrapper[4909]: I1201 10:49:42.984668 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-303c-account-create-update-8ff88" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.101172 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5fpj\" (UniqueName: \"kubernetes.io/projected/17519694-f981-49fe-8579-c037e0afd59a-kube-api-access-s5fpj\") pod \"17519694-f981-49fe-8579-c037e0afd59a\" (UID: \"17519694-f981-49fe-8579-c037e0afd59a\") " Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.101392 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17519694-f981-49fe-8579-c037e0afd59a-operator-scripts\") pod \"17519694-f981-49fe-8579-c037e0afd59a\" (UID: \"17519694-f981-49fe-8579-c037e0afd59a\") " Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.102567 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17519694-f981-49fe-8579-c037e0afd59a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "17519694-f981-49fe-8579-c037e0afd59a" (UID: "17519694-f981-49fe-8579-c037e0afd59a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.103150 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17519694-f981-49fe-8579-c037e0afd59a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.119037 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17519694-f981-49fe-8579-c037e0afd59a-kube-api-access-s5fpj" (OuterVolumeSpecName: "kube-api-access-s5fpj") pod "17519694-f981-49fe-8579-c037e0afd59a" (UID: "17519694-f981-49fe-8579-c037e0afd59a"). InnerVolumeSpecName "kube-api-access-s5fpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.205973 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5fpj\" (UniqueName: \"kubernetes.io/projected/17519694-f981-49fe-8579-c037e0afd59a-kube-api-access-s5fpj\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.262678 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-85ccm" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.278051 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-8nwnt" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.297897 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f543-account-create-update-qnsrz" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.328142 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c69c-account-create-update-w655r" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.420079 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjbbc\" (UniqueName: \"kubernetes.io/projected/3d8848ea-48fe-4b9c-9cd2-6935a6a28717-kube-api-access-bjbbc\") pod \"3d8848ea-48fe-4b9c-9cd2-6935a6a28717\" (UID: \"3d8848ea-48fe-4b9c-9cd2-6935a6a28717\") " Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.420179 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ddd34b0-2faa-4d36-a8f6-485e2d05a71a-operator-scripts\") pod \"1ddd34b0-2faa-4d36-a8f6-485e2d05a71a\" (UID: \"1ddd34b0-2faa-4d36-a8f6-485e2d05a71a\") " Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.420245 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp5dm\" (UniqueName: \"kubernetes.io/projected/1ddd34b0-2faa-4d36-a8f6-485e2d05a71a-kube-api-access-pp5dm\") pod \"1ddd34b0-2faa-4d36-a8f6-485e2d05a71a\" (UID: \"1ddd34b0-2faa-4d36-a8f6-485e2d05a71a\") " Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.420378 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39a13de5-4b8b-4ec1-b6ab-de297ac31eea-operator-scripts\") pod \"39a13de5-4b8b-4ec1-b6ab-de297ac31eea\" (UID: \"39a13de5-4b8b-4ec1-b6ab-de297ac31eea\") " Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.420409 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d8848ea-48fe-4b9c-9cd2-6935a6a28717-operator-scripts\") pod \"3d8848ea-48fe-4b9c-9cd2-6935a6a28717\" (UID: \"3d8848ea-48fe-4b9c-9cd2-6935a6a28717\") " Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.420499 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-t5zxw\" (UniqueName: \"kubernetes.io/projected/0c829e27-1fd3-4d20-a4c8-080580ea341d-kube-api-access-t5zxw\") pod \"0c829e27-1fd3-4d20-a4c8-080580ea341d\" (UID: \"0c829e27-1fd3-4d20-a4c8-080580ea341d\") " Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.420600 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c829e27-1fd3-4d20-a4c8-080580ea341d-operator-scripts\") pod \"0c829e27-1fd3-4d20-a4c8-080580ea341d\" (UID: \"0c829e27-1fd3-4d20-a4c8-080580ea341d\") " Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.420687 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9msmz\" (UniqueName: \"kubernetes.io/projected/39a13de5-4b8b-4ec1-b6ab-de297ac31eea-kube-api-access-9msmz\") pod \"39a13de5-4b8b-4ec1-b6ab-de297ac31eea\" (UID: \"39a13de5-4b8b-4ec1-b6ab-de297ac31eea\") " Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.423299 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d8848ea-48fe-4b9c-9cd2-6935a6a28717-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d8848ea-48fe-4b9c-9cd2-6935a6a28717" (UID: "3d8848ea-48fe-4b9c-9cd2-6935a6a28717"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.425026 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d8848ea-48fe-4b9c-9cd2-6935a6a28717-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.425680 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a13de5-4b8b-4ec1-b6ab-de297ac31eea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "39a13de5-4b8b-4ec1-b6ab-de297ac31eea" (UID: "39a13de5-4b8b-4ec1-b6ab-de297ac31eea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.425681 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ddd34b0-2faa-4d36-a8f6-485e2d05a71a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1ddd34b0-2faa-4d36-a8f6-485e2d05a71a" (UID: "1ddd34b0-2faa-4d36-a8f6-485e2d05a71a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.426174 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c829e27-1fd3-4d20-a4c8-080580ea341d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c829e27-1fd3-4d20-a4c8-080580ea341d" (UID: "0c829e27-1fd3-4d20-a4c8-080580ea341d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.429348 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d8848ea-48fe-4b9c-9cd2-6935a6a28717-kube-api-access-bjbbc" (OuterVolumeSpecName: "kube-api-access-bjbbc") pod "3d8848ea-48fe-4b9c-9cd2-6935a6a28717" (UID: "3d8848ea-48fe-4b9c-9cd2-6935a6a28717"). InnerVolumeSpecName "kube-api-access-bjbbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.430200 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c829e27-1fd3-4d20-a4c8-080580ea341d-kube-api-access-t5zxw" (OuterVolumeSpecName: "kube-api-access-t5zxw") pod "0c829e27-1fd3-4d20-a4c8-080580ea341d" (UID: "0c829e27-1fd3-4d20-a4c8-080580ea341d"). InnerVolumeSpecName "kube-api-access-t5zxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.434240 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ddd34b0-2faa-4d36-a8f6-485e2d05a71a-kube-api-access-pp5dm" (OuterVolumeSpecName: "kube-api-access-pp5dm") pod "1ddd34b0-2faa-4d36-a8f6-485e2d05a71a" (UID: "1ddd34b0-2faa-4d36-a8f6-485e2d05a71a"). InnerVolumeSpecName "kube-api-access-pp5dm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.436695 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39a13de5-4b8b-4ec1-b6ab-de297ac31eea-kube-api-access-9msmz" (OuterVolumeSpecName: "kube-api-access-9msmz") pod "39a13de5-4b8b-4ec1-b6ab-de297ac31eea" (UID: "39a13de5-4b8b-4ec1-b6ab-de297ac31eea"). InnerVolumeSpecName "kube-api-access-9msmz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.515413 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nhslm" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.526729 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjbbc\" (UniqueName: \"kubernetes.io/projected/3d8848ea-48fe-4b9c-9cd2-6935a6a28717-kube-api-access-bjbbc\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.526764 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ddd34b0-2faa-4d36-a8f6-485e2d05a71a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.526774 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pp5dm\" (UniqueName: \"kubernetes.io/projected/1ddd34b0-2faa-4d36-a8f6-485e2d05a71a-kube-api-access-pp5dm\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.526783 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39a13de5-4b8b-4ec1-b6ab-de297ac31eea-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.526792 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5zxw\" (UniqueName: \"kubernetes.io/projected/0c829e27-1fd3-4d20-a4c8-080580ea341d-kube-api-access-t5zxw\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.526800 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c829e27-1fd3-4d20-a4c8-080580ea341d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.526808 4909 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-9msmz\" (UniqueName: \"kubernetes.io/projected/39a13de5-4b8b-4ec1-b6ab-de297ac31eea-kube-api-access-9msmz\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.627527 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a15e2fbe-81d5-4d34-a5a4-e920de0c607e-operator-scripts\") pod \"a15e2fbe-81d5-4d34-a5a4-e920de0c607e\" (UID: \"a15e2fbe-81d5-4d34-a5a4-e920de0c607e\") " Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.627734 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvqlq\" (UniqueName: \"kubernetes.io/projected/a15e2fbe-81d5-4d34-a5a4-e920de0c607e-kube-api-access-tvqlq\") pod \"a15e2fbe-81d5-4d34-a5a4-e920de0c607e\" (UID: \"a15e2fbe-81d5-4d34-a5a4-e920de0c607e\") " Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.629135 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a15e2fbe-81d5-4d34-a5a4-e920de0c607e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a15e2fbe-81d5-4d34-a5a4-e920de0c607e" (UID: "a15e2fbe-81d5-4d34-a5a4-e920de0c607e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.631413 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-8nwnt" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.632765 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8nwnt" event={"ID":"3d8848ea-48fe-4b9c-9cd2-6935a6a28717","Type":"ContainerDied","Data":"a13f30b62ac6d435ce9bb549be5523a9ab3299a52e25535075d1d94aa11bac16"} Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.632811 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a13f30b62ac6d435ce9bb549be5523a9ab3299a52e25535075d1d94aa11bac16" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.637158 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f543-account-create-update-qnsrz" event={"ID":"0c829e27-1fd3-4d20-a4c8-080580ea341d","Type":"ContainerDied","Data":"a52c043d5a2359ac427f313b0dd44b9178eb0d1287febb8c7744e0b767efffc4"} Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.637618 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a52c043d5a2359ac427f313b0dd44b9178eb0d1287febb8c7744e0b767efffc4" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.637728 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f543-account-create-update-qnsrz" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.638404 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a15e2fbe-81d5-4d34-a5a4-e920de0c607e-kube-api-access-tvqlq" (OuterVolumeSpecName: "kube-api-access-tvqlq") pod "a15e2fbe-81d5-4d34-a5a4-e920de0c607e" (UID: "a15e2fbe-81d5-4d34-a5a4-e920de0c607e"). InnerVolumeSpecName "kube-api-access-tvqlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.647777 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-85ccm" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.648543 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-85ccm" event={"ID":"1ddd34b0-2faa-4d36-a8f6-485e2d05a71a","Type":"ContainerDied","Data":"2a9f56c227f3e0381bbdd024a92b00a799daf7c179bd7fc9f628a1e69a86b255"} Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.648578 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a9f56c227f3e0381bbdd024a92b00a799daf7c179bd7fc9f628a1e69a86b255" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.658536 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-303c-account-create-update-8ff88" event={"ID":"17519694-f981-49fe-8579-c037e0afd59a","Type":"ContainerDied","Data":"65c881a5d031a6e13d4fbc388e4147c08975a3ca68284d57e2513ee0dc941c51"} Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.658604 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65c881a5d031a6e13d4fbc388e4147c08975a3ca68284d57e2513ee0dc941c51" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.658715 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-303c-account-create-update-8ff88" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.666938 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nhslm" event={"ID":"a15e2fbe-81d5-4d34-a5a4-e920de0c607e","Type":"ContainerDied","Data":"76639479ac33403aba6c688276bf92809165567a427fc7ede80cd360e50b1b27"} Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.667007 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76639479ac33403aba6c688276bf92809165567a427fc7ede80cd360e50b1b27" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.667090 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-nhslm" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.683745 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c69c-account-create-update-w655r" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.683816 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c69c-account-create-update-w655r" event={"ID":"39a13de5-4b8b-4ec1-b6ab-de297ac31eea","Type":"ContainerDied","Data":"3e209541314a9527e07892001cef24ccc10fef3468073a29572c3cf48f7f15ef"} Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.683900 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e209541314a9527e07892001cef24ccc10fef3468073a29572c3cf48f7f15ef" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.690482 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2c19b6f5-fcf4-4655-bdaa-10257b92f6dd","Type":"ContainerStarted","Data":"4ebc6d3483b488fc662a3bc4d684620dcc0444daeedaf094c9925ac8e5366c7b"} Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.691006 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.743040 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvqlq\" (UniqueName: \"kubernetes.io/projected/a15e2fbe-81d5-4d34-a5a4-e920de0c607e-kube-api-access-tvqlq\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:43 crc kubenswrapper[4909]: I1201 10:49:43.743066 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a15e2fbe-81d5-4d34-a5a4-e920de0c607e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:44 crc kubenswrapper[4909]: I1201 10:49:44.033150 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" 
podStartSLOduration=4.03313028 podStartE2EDuration="4.03313028s" podCreationTimestamp="2025-12-01 10:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:49:43.716155583 +0000 UTC m=+1100.950626501" watchObservedRunningTime="2025-12-01 10:49:44.03313028 +0000 UTC m=+1101.267601178" Dec 01 10:49:44 crc kubenswrapper[4909]: I1201 10:49:44.704806 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d40688c0-30f7-4659-aa2d-cb314ab383ab" containerName="ceilometer-central-agent" containerID="cri-o://650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6" gracePeriod=30 Dec 01 10:49:44 crc kubenswrapper[4909]: I1201 10:49:44.705356 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d40688c0-30f7-4659-aa2d-cb314ab383ab" containerName="proxy-httpd" containerID="cri-o://de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd" gracePeriod=30 Dec 01 10:49:44 crc kubenswrapper[4909]: I1201 10:49:44.705438 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d40688c0-30f7-4659-aa2d-cb314ab383ab" containerName="sg-core" containerID="cri-o://5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5" gracePeriod=30 Dec 01 10:49:44 crc kubenswrapper[4909]: I1201 10:49:44.705488 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d40688c0-30f7-4659-aa2d-cb314ab383ab" containerName="ceilometer-notification-agent" containerID="cri-o://6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b" gracePeriod=30 Dec 01 10:49:44 crc kubenswrapper[4909]: I1201 10:49:44.705778 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d40688c0-30f7-4659-aa2d-cb314ab383ab","Type":"ContainerStarted","Data":"de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd"} Dec 01 10:49:44 crc kubenswrapper[4909]: I1201 10:49:44.705815 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 10:49:44 crc kubenswrapper[4909]: I1201 10:49:44.738278 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.863539327 podStartE2EDuration="7.738253865s" podCreationTimestamp="2025-12-01 10:49:37 +0000 UTC" firstStartedPulling="2025-12-01 10:49:38.285610342 +0000 UTC m=+1095.520081240" lastFinishedPulling="2025-12-01 10:49:43.16032488 +0000 UTC m=+1100.394795778" observedRunningTime="2025-12-01 10:49:44.735719256 +0000 UTC m=+1101.970190184" watchObservedRunningTime="2025-12-01 10:49:44.738253865 +0000 UTC m=+1101.972724773" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.461413 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.576698 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wqv2\" (UniqueName: \"kubernetes.io/projected/d40688c0-30f7-4659-aa2d-cb314ab383ab-kube-api-access-9wqv2\") pod \"d40688c0-30f7-4659-aa2d-cb314ab383ab\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.576773 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-scripts\") pod \"d40688c0-30f7-4659-aa2d-cb314ab383ab\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.576832 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d40688c0-30f7-4659-aa2d-cb314ab383ab-log-httpd\") pod \"d40688c0-30f7-4659-aa2d-cb314ab383ab\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.576858 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d40688c0-30f7-4659-aa2d-cb314ab383ab-run-httpd\") pod \"d40688c0-30f7-4659-aa2d-cb314ab383ab\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.576951 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-combined-ca-bundle\") pod \"d40688c0-30f7-4659-aa2d-cb314ab383ab\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.577003 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-config-data\") pod \"d40688c0-30f7-4659-aa2d-cb314ab383ab\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.577092 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-sg-core-conf-yaml\") pod \"d40688c0-30f7-4659-aa2d-cb314ab383ab\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.577250 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-ceilometer-tls-certs\") pod \"d40688c0-30f7-4659-aa2d-cb314ab383ab\" (UID: \"d40688c0-30f7-4659-aa2d-cb314ab383ab\") " Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.577532 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d40688c0-30f7-4659-aa2d-cb314ab383ab-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d40688c0-30f7-4659-aa2d-cb314ab383ab" (UID: "d40688c0-30f7-4659-aa2d-cb314ab383ab"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.577806 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d40688c0-30f7-4659-aa2d-cb314ab383ab-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.577943 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d40688c0-30f7-4659-aa2d-cb314ab383ab-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d40688c0-30f7-4659-aa2d-cb314ab383ab" (UID: "d40688c0-30f7-4659-aa2d-cb314ab383ab"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.584183 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d40688c0-30f7-4659-aa2d-cb314ab383ab-kube-api-access-9wqv2" (OuterVolumeSpecName: "kube-api-access-9wqv2") pod "d40688c0-30f7-4659-aa2d-cb314ab383ab" (UID: "d40688c0-30f7-4659-aa2d-cb314ab383ab"). InnerVolumeSpecName "kube-api-access-9wqv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.587404 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-scripts" (OuterVolumeSpecName: "scripts") pod "d40688c0-30f7-4659-aa2d-cb314ab383ab" (UID: "d40688c0-30f7-4659-aa2d-cb314ab383ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.612404 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d40688c0-30f7-4659-aa2d-cb314ab383ab" (UID: "d40688c0-30f7-4659-aa2d-cb314ab383ab"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.635204 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d40688c0-30f7-4659-aa2d-cb314ab383ab" (UID: "d40688c0-30f7-4659-aa2d-cb314ab383ab"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.655297 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d40688c0-30f7-4659-aa2d-cb314ab383ab" (UID: "d40688c0-30f7-4659-aa2d-cb314ab383ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.679524 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.679557 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d40688c0-30f7-4659-aa2d-cb314ab383ab-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.679568 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.679580 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.679588 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.679601 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wqv2\" (UniqueName: 
\"kubernetes.io/projected/d40688c0-30f7-4659-aa2d-cb314ab383ab-kube-api-access-9wqv2\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.692266 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-config-data" (OuterVolumeSpecName: "config-data") pod "d40688c0-30f7-4659-aa2d-cb314ab383ab" (UID: "d40688c0-30f7-4659-aa2d-cb314ab383ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.714956 4909 generic.go:334] "Generic (PLEG): container finished" podID="d40688c0-30f7-4659-aa2d-cb314ab383ab" containerID="de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd" exitCode=0 Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.714997 4909 generic.go:334] "Generic (PLEG): container finished" podID="d40688c0-30f7-4659-aa2d-cb314ab383ab" containerID="5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5" exitCode=2 Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.715007 4909 generic.go:334] "Generic (PLEG): container finished" podID="d40688c0-30f7-4659-aa2d-cb314ab383ab" containerID="6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b" exitCode=0 Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.715014 4909 generic.go:334] "Generic (PLEG): container finished" podID="d40688c0-30f7-4659-aa2d-cb314ab383ab" containerID="650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6" exitCode=0 Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.715037 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d40688c0-30f7-4659-aa2d-cb314ab383ab","Type":"ContainerDied","Data":"de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd"} Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.715069 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"d40688c0-30f7-4659-aa2d-cb314ab383ab","Type":"ContainerDied","Data":"5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5"} Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.715085 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d40688c0-30f7-4659-aa2d-cb314ab383ab","Type":"ContainerDied","Data":"6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b"} Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.715098 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d40688c0-30f7-4659-aa2d-cb314ab383ab","Type":"ContainerDied","Data":"650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6"} Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.715109 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d40688c0-30f7-4659-aa2d-cb314ab383ab","Type":"ContainerDied","Data":"e954eb5e6e77bd9b787e2cc13b9435a2ec6cdab99534258eedf3040aa6afd87d"} Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.715130 4909 scope.go:117] "RemoveContainer" containerID="de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.715166 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.742234 4909 scope.go:117] "RemoveContainer" containerID="5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.758775 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.765372 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.769086 4909 scope.go:117] "RemoveContainer" containerID="6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.781694 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d40688c0-30f7-4659-aa2d-cb314ab383ab-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.787375 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:49:45 crc kubenswrapper[4909]: E1201 10:49:45.787779 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d40688c0-30f7-4659-aa2d-cb314ab383ab" containerName="proxy-httpd" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.787801 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40688c0-30f7-4659-aa2d-cb314ab383ab" containerName="proxy-httpd" Dec 01 10:49:45 crc kubenswrapper[4909]: E1201 10:49:45.787822 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d40688c0-30f7-4659-aa2d-cb314ab383ab" containerName="sg-core" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.787833 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40688c0-30f7-4659-aa2d-cb314ab383ab" containerName="sg-core" Dec 01 10:49:45 crc kubenswrapper[4909]: E1201 10:49:45.787847 4909 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="0c829e27-1fd3-4d20-a4c8-080580ea341d" containerName="mariadb-account-create-update" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.787892 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c829e27-1fd3-4d20-a4c8-080580ea341d" containerName="mariadb-account-create-update" Dec 01 10:49:45 crc kubenswrapper[4909]: E1201 10:49:45.787907 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ddd34b0-2faa-4d36-a8f6-485e2d05a71a" containerName="mariadb-database-create" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.787913 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ddd34b0-2faa-4d36-a8f6-485e2d05a71a" containerName="mariadb-database-create" Dec 01 10:49:45 crc kubenswrapper[4909]: E1201 10:49:45.787923 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d40688c0-30f7-4659-aa2d-cb314ab383ab" containerName="ceilometer-notification-agent" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.787929 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40688c0-30f7-4659-aa2d-cb314ab383ab" containerName="ceilometer-notification-agent" Dec 01 10:49:45 crc kubenswrapper[4909]: E1201 10:49:45.787936 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d8848ea-48fe-4b9c-9cd2-6935a6a28717" containerName="mariadb-database-create" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.787942 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d8848ea-48fe-4b9c-9cd2-6935a6a28717" containerName="mariadb-database-create" Dec 01 10:49:45 crc kubenswrapper[4909]: E1201 10:49:45.787962 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17519694-f981-49fe-8579-c037e0afd59a" containerName="mariadb-account-create-update" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.787968 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="17519694-f981-49fe-8579-c037e0afd59a" containerName="mariadb-account-create-update" Dec 01 
10:49:45 crc kubenswrapper[4909]: E1201 10:49:45.787976 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a15e2fbe-81d5-4d34-a5a4-e920de0c607e" containerName="mariadb-database-create" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.787983 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15e2fbe-81d5-4d34-a5a4-e920de0c607e" containerName="mariadb-database-create" Dec 01 10:49:45 crc kubenswrapper[4909]: E1201 10:49:45.787996 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d40688c0-30f7-4659-aa2d-cb314ab383ab" containerName="ceilometer-central-agent" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.788001 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40688c0-30f7-4659-aa2d-cb314ab383ab" containerName="ceilometer-central-agent" Dec 01 10:49:45 crc kubenswrapper[4909]: E1201 10:49:45.788017 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a13de5-4b8b-4ec1-b6ab-de297ac31eea" containerName="mariadb-account-create-update" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.788023 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a13de5-4b8b-4ec1-b6ab-de297ac31eea" containerName="mariadb-account-create-update" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.788181 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c829e27-1fd3-4d20-a4c8-080580ea341d" containerName="mariadb-account-create-update" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.788194 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="39a13de5-4b8b-4ec1-b6ab-de297ac31eea" containerName="mariadb-account-create-update" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.788204 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d40688c0-30f7-4659-aa2d-cb314ab383ab" containerName="ceilometer-central-agent" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.788210 4909 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d40688c0-30f7-4659-aa2d-cb314ab383ab" containerName="ceilometer-notification-agent" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.788220 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d40688c0-30f7-4659-aa2d-cb314ab383ab" containerName="sg-core" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.788234 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d40688c0-30f7-4659-aa2d-cb314ab383ab" containerName="proxy-httpd" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.788322 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d8848ea-48fe-4b9c-9cd2-6935a6a28717" containerName="mariadb-database-create" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.788337 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ddd34b0-2faa-4d36-a8f6-485e2d05a71a" containerName="mariadb-database-create" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.788359 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a15e2fbe-81d5-4d34-a5a4-e920de0c607e" containerName="mariadb-database-create" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.788368 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="17519694-f981-49fe-8579-c037e0afd59a" containerName="mariadb-account-create-update" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.790092 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.790612 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.794025 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.794061 4909 scope.go:117] "RemoveContainer" containerID="650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.794274 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.794440 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.812748 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.848071 4909 scope.go:117] "RemoveContainer" containerID="de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd" Dec 01 10:49:45 crc kubenswrapper[4909]: E1201 10:49:45.852992 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd\": container with ID starting with de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd not found: ID does not exist" containerID="de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.853041 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd"} err="failed to get container status 
\"de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd\": rpc error: code = NotFound desc = could not find container \"de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd\": container with ID starting with de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd not found: ID does not exist" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.853070 4909 scope.go:117] "RemoveContainer" containerID="5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5" Dec 01 10:49:45 crc kubenswrapper[4909]: E1201 10:49:45.858033 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5\": container with ID starting with 5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5 not found: ID does not exist" containerID="5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.858085 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5"} err="failed to get container status \"5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5\": rpc error: code = NotFound desc = could not find container \"5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5\": container with ID starting with 5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5 not found: ID does not exist" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.858116 4909 scope.go:117] "RemoveContainer" containerID="6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b" Dec 01 10:49:45 crc kubenswrapper[4909]: E1201 10:49:45.859152 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b\": container with ID starting with 6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b not found: ID does not exist" containerID="6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.859205 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b"} err="failed to get container status \"6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b\": rpc error: code = NotFound desc = could not find container \"6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b\": container with ID starting with 6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b not found: ID does not exist" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.859242 4909 scope.go:117] "RemoveContainer" containerID="650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6" Dec 01 10:49:45 crc kubenswrapper[4909]: E1201 10:49:45.859527 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6\": container with ID starting with 650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6 not found: ID does not exist" containerID="650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.859553 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6"} err="failed to get container status \"650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6\": rpc error: code = NotFound desc = could not find container \"650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6\": container with ID 
starting with 650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6 not found: ID does not exist" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.859570 4909 scope.go:117] "RemoveContainer" containerID="de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.859774 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd"} err="failed to get container status \"de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd\": rpc error: code = NotFound desc = could not find container \"de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd\": container with ID starting with de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd not found: ID does not exist" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.859796 4909 scope.go:117] "RemoveContainer" containerID="5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.860051 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5"} err="failed to get container status \"5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5\": rpc error: code = NotFound desc = could not find container \"5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5\": container with ID starting with 5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5 not found: ID does not exist" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.860077 4909 scope.go:117] "RemoveContainer" containerID="6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.860265 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b"} err="failed to get container status \"6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b\": rpc error: code = NotFound desc = could not find container \"6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b\": container with ID starting with 6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b not found: ID does not exist" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.860286 4909 scope.go:117] "RemoveContainer" containerID="650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.860488 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6"} err="failed to get container status \"650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6\": rpc error: code = NotFound desc = could not find container \"650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6\": container with ID starting with 650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6 not found: ID does not exist" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.860511 4909 scope.go:117] "RemoveContainer" containerID="de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.860714 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd"} err="failed to get container status \"de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd\": rpc error: code = NotFound desc = could not find container \"de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd\": container with ID starting with de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd not found: ID does not 
exist" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.860731 4909 scope.go:117] "RemoveContainer" containerID="5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.861264 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5"} err="failed to get container status \"5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5\": rpc error: code = NotFound desc = could not find container \"5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5\": container with ID starting with 5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5 not found: ID does not exist" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.861283 4909 scope.go:117] "RemoveContainer" containerID="6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.861610 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b"} err="failed to get container status \"6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b\": rpc error: code = NotFound desc = could not find container \"6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b\": container with ID starting with 6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b not found: ID does not exist" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.861634 4909 scope.go:117] "RemoveContainer" containerID="650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.861946 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6"} err="failed to get container status 
\"650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6\": rpc error: code = NotFound desc = could not find container \"650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6\": container with ID starting with 650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6 not found: ID does not exist" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.861973 4909 scope.go:117] "RemoveContainer" containerID="de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.862238 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd"} err="failed to get container status \"de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd\": rpc error: code = NotFound desc = could not find container \"de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd\": container with ID starting with de13ba8bb184d781418205a52f68a433f2b71643422f632f9dc787e8cae601fd not found: ID does not exist" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.862259 4909 scope.go:117] "RemoveContainer" containerID="5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.862717 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5"} err="failed to get container status \"5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5\": rpc error: code = NotFound desc = could not find container \"5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5\": container with ID starting with 5bef6284d0e634d525c67fa476aa719326c60563fb1e2c36449403ac5062d2a5 not found: ID does not exist" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.862741 4909 scope.go:117] "RemoveContainer" 
containerID="6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.863885 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b"} err="failed to get container status \"6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b\": rpc error: code = NotFound desc = could not find container \"6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b\": container with ID starting with 6745b50014cc99a61a42ef0537d07467afab2b6d047dc793f0ee33734d30ab9b not found: ID does not exist" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.863936 4909 scope.go:117] "RemoveContainer" containerID="650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.865414 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6"} err="failed to get container status \"650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6\": rpc error: code = NotFound desc = could not find container \"650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6\": container with ID starting with 650f8afd9a38cfd8e14e780515e28a5023423654a873f703cf33dec6557de2a6 not found: ID does not exist" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.883324 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e390846-ddfc-4952-907e-8562a5bc63f8-run-httpd\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.883386 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vf5s\" 
(UniqueName: \"kubernetes.io/projected/9e390846-ddfc-4952-907e-8562a5bc63f8-kube-api-access-2vf5s\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.883434 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-config-data\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.883486 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e390846-ddfc-4952-907e-8562a5bc63f8-log-httpd\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.883542 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-scripts\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.883577 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.883623 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.883691 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.985733 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e390846-ddfc-4952-907e-8562a5bc63f8-run-httpd\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.986091 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vf5s\" (UniqueName: \"kubernetes.io/projected/9e390846-ddfc-4952-907e-8562a5bc63f8-kube-api-access-2vf5s\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.986120 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-config-data\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.986163 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e390846-ddfc-4952-907e-8562a5bc63f8-log-httpd\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.986214 4909 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-scripts\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.986256 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.986284 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.986329 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.987417 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e390846-ddfc-4952-907e-8562a5bc63f8-log-httpd\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.987720 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e390846-ddfc-4952-907e-8562a5bc63f8-run-httpd\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:45 crc 
kubenswrapper[4909]: I1201 10:49:45.992581 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-config-data\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.992717 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.994180 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:45 crc kubenswrapper[4909]: I1201 10:49:45.994351 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:46 crc kubenswrapper[4909]: I1201 10:49:46.000429 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-scripts\") pod \"ceilometer-0\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:46 crc kubenswrapper[4909]: I1201 10:49:46.013985 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vf5s\" (UniqueName: \"kubernetes.io/projected/9e390846-ddfc-4952-907e-8562a5bc63f8-kube-api-access-2vf5s\") pod \"ceilometer-0\" 
(UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " pod="openstack/ceilometer-0" Dec 01 10:49:46 crc kubenswrapper[4909]: I1201 10:49:46.109931 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:49:46 crc kubenswrapper[4909]: I1201 10:49:46.463923 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:49:46 crc kubenswrapper[4909]: I1201 10:49:46.582598 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:49:46 crc kubenswrapper[4909]: I1201 10:49:46.733307 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e390846-ddfc-4952-907e-8562a5bc63f8","Type":"ContainerStarted","Data":"01aa55673e6a3e44f5584ce51399de0fdd7822446d03bc7d743fb9cc44d97871"} Dec 01 10:49:47 crc kubenswrapper[4909]: I1201 10:49:47.267687 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d40688c0-30f7-4659-aa2d-cb314ab383ab" path="/var/lib/kubelet/pods/d40688c0-30f7-4659-aa2d-cb314ab383ab/volumes" Dec 01 10:49:47 crc kubenswrapper[4909]: I1201 10:49:47.744767 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e390846-ddfc-4952-907e-8562a5bc63f8","Type":"ContainerStarted","Data":"70c3db63cada5186ae6ffca11add9dee9cb7ae0be9e164199e3dc20b2d990e50"} Dec 01 10:49:48 crc kubenswrapper[4909]: I1201 10:49:48.756047 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e390846-ddfc-4952-907e-8562a5bc63f8","Type":"ContainerStarted","Data":"bcbf7ef8a1a283205a204880e9bb988a4c899b693044b726582178e91e44bc26"} Dec 01 10:49:49 crc kubenswrapper[4909]: I1201 10:49:49.625809 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s699j"] Dec 01 10:49:49 crc kubenswrapper[4909]: I1201 10:49:49.627521 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-s699j" Dec 01 10:49:49 crc kubenswrapper[4909]: I1201 10:49:49.662912 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 10:49:49 crc kubenswrapper[4909]: I1201 10:49:49.663068 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 01 10:49:49 crc kubenswrapper[4909]: I1201 10:49:49.663384 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rkftn" Dec 01 10:49:49 crc kubenswrapper[4909]: I1201 10:49:49.679347 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s699j"] Dec 01 10:49:49 crc kubenswrapper[4909]: I1201 10:49:49.765613 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88326ebb-a888-4633-9459-114d0e1f2cc9-config-data\") pod \"nova-cell0-conductor-db-sync-s699j\" (UID: \"88326ebb-a888-4633-9459-114d0e1f2cc9\") " pod="openstack/nova-cell0-conductor-db-sync-s699j" Dec 01 10:49:49 crc kubenswrapper[4909]: I1201 10:49:49.765697 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88326ebb-a888-4633-9459-114d0e1f2cc9-scripts\") pod \"nova-cell0-conductor-db-sync-s699j\" (UID: \"88326ebb-a888-4633-9459-114d0e1f2cc9\") " pod="openstack/nova-cell0-conductor-db-sync-s699j" Dec 01 10:49:49 crc kubenswrapper[4909]: I1201 10:49:49.766064 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjmjl\" (UniqueName: \"kubernetes.io/projected/88326ebb-a888-4633-9459-114d0e1f2cc9-kube-api-access-kjmjl\") pod \"nova-cell0-conductor-db-sync-s699j\" (UID: \"88326ebb-a888-4633-9459-114d0e1f2cc9\") " 
pod="openstack/nova-cell0-conductor-db-sync-s699j" Dec 01 10:49:49 crc kubenswrapper[4909]: I1201 10:49:49.766276 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88326ebb-a888-4633-9459-114d0e1f2cc9-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-s699j\" (UID: \"88326ebb-a888-4633-9459-114d0e1f2cc9\") " pod="openstack/nova-cell0-conductor-db-sync-s699j" Dec 01 10:49:49 crc kubenswrapper[4909]: I1201 10:49:49.771426 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e390846-ddfc-4952-907e-8562a5bc63f8","Type":"ContainerStarted","Data":"c54ec3b4fa9c6e47dce4e5a815f494905aa2c56fbab9adc52920a2d79dc18d1a"} Dec 01 10:49:49 crc kubenswrapper[4909]: I1201 10:49:49.868698 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88326ebb-a888-4633-9459-114d0e1f2cc9-config-data\") pod \"nova-cell0-conductor-db-sync-s699j\" (UID: \"88326ebb-a888-4633-9459-114d0e1f2cc9\") " pod="openstack/nova-cell0-conductor-db-sync-s699j" Dec 01 10:49:49 crc kubenswrapper[4909]: I1201 10:49:49.868755 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88326ebb-a888-4633-9459-114d0e1f2cc9-scripts\") pod \"nova-cell0-conductor-db-sync-s699j\" (UID: \"88326ebb-a888-4633-9459-114d0e1f2cc9\") " pod="openstack/nova-cell0-conductor-db-sync-s699j" Dec 01 10:49:49 crc kubenswrapper[4909]: I1201 10:49:49.868845 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjmjl\" (UniqueName: \"kubernetes.io/projected/88326ebb-a888-4633-9459-114d0e1f2cc9-kube-api-access-kjmjl\") pod \"nova-cell0-conductor-db-sync-s699j\" (UID: \"88326ebb-a888-4633-9459-114d0e1f2cc9\") " pod="openstack/nova-cell0-conductor-db-sync-s699j" Dec 01 10:49:49 crc 
kubenswrapper[4909]: I1201 10:49:49.868936 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88326ebb-a888-4633-9459-114d0e1f2cc9-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-s699j\" (UID: \"88326ebb-a888-4633-9459-114d0e1f2cc9\") " pod="openstack/nova-cell0-conductor-db-sync-s699j" Dec 01 10:49:49 crc kubenswrapper[4909]: I1201 10:49:49.876539 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88326ebb-a888-4633-9459-114d0e1f2cc9-config-data\") pod \"nova-cell0-conductor-db-sync-s699j\" (UID: \"88326ebb-a888-4633-9459-114d0e1f2cc9\") " pod="openstack/nova-cell0-conductor-db-sync-s699j" Dec 01 10:49:49 crc kubenswrapper[4909]: I1201 10:49:49.880054 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88326ebb-a888-4633-9459-114d0e1f2cc9-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-s699j\" (UID: \"88326ebb-a888-4633-9459-114d0e1f2cc9\") " pod="openstack/nova-cell0-conductor-db-sync-s699j" Dec 01 10:49:49 crc kubenswrapper[4909]: I1201 10:49:49.883400 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88326ebb-a888-4633-9459-114d0e1f2cc9-scripts\") pod \"nova-cell0-conductor-db-sync-s699j\" (UID: \"88326ebb-a888-4633-9459-114d0e1f2cc9\") " pod="openstack/nova-cell0-conductor-db-sync-s699j" Dec 01 10:49:49 crc kubenswrapper[4909]: I1201 10:49:49.888210 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjmjl\" (UniqueName: \"kubernetes.io/projected/88326ebb-a888-4633-9459-114d0e1f2cc9-kube-api-access-kjmjl\") pod \"nova-cell0-conductor-db-sync-s699j\" (UID: \"88326ebb-a888-4633-9459-114d0e1f2cc9\") " pod="openstack/nova-cell0-conductor-db-sync-s699j" Dec 01 10:49:49 crc kubenswrapper[4909]: I1201 
10:49:49.985191 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-s699j" Dec 01 10:49:50 crc kubenswrapper[4909]: I1201 10:49:50.474952 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s699j"] Dec 01 10:49:50 crc kubenswrapper[4909]: I1201 10:49:50.785151 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e390846-ddfc-4952-907e-8562a5bc63f8","Type":"ContainerStarted","Data":"e8799d07babc2b86da86ba8d7220b7c972036b9b65f45b2387bdb92657b227ea"} Dec 01 10:49:50 crc kubenswrapper[4909]: I1201 10:49:50.785284 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9e390846-ddfc-4952-907e-8562a5bc63f8" containerName="ceilometer-central-agent" containerID="cri-o://70c3db63cada5186ae6ffca11add9dee9cb7ae0be9e164199e3dc20b2d990e50" gracePeriod=30 Dec 01 10:49:50 crc kubenswrapper[4909]: I1201 10:49:50.785317 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 10:49:50 crc kubenswrapper[4909]: I1201 10:49:50.785362 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9e390846-ddfc-4952-907e-8562a5bc63f8" containerName="proxy-httpd" containerID="cri-o://e8799d07babc2b86da86ba8d7220b7c972036b9b65f45b2387bdb92657b227ea" gracePeriod=30 Dec 01 10:49:50 crc kubenswrapper[4909]: I1201 10:49:50.785411 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9e390846-ddfc-4952-907e-8562a5bc63f8" containerName="sg-core" containerID="cri-o://c54ec3b4fa9c6e47dce4e5a815f494905aa2c56fbab9adc52920a2d79dc18d1a" gracePeriod=30 Dec 01 10:49:50 crc kubenswrapper[4909]: I1201 10:49:50.785449 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="9e390846-ddfc-4952-907e-8562a5bc63f8" containerName="ceilometer-notification-agent" containerID="cri-o://bcbf7ef8a1a283205a204880e9bb988a4c899b693044b726582178e91e44bc26" gracePeriod=30 Dec 01 10:49:50 crc kubenswrapper[4909]: I1201 10:49:50.789979 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-s699j" event={"ID":"88326ebb-a888-4633-9459-114d0e1f2cc9","Type":"ContainerStarted","Data":"45169fa66a38bfabc0b4a396af46e3457a30fd2bbb63c7b174c1c2da34bd5aec"} Dec 01 10:49:50 crc kubenswrapper[4909]: I1201 10:49:50.813421 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.903636047 podStartE2EDuration="5.813401043s" podCreationTimestamp="2025-12-01 10:49:45 +0000 UTC" firstStartedPulling="2025-12-01 10:49:46.589849539 +0000 UTC m=+1103.824320437" lastFinishedPulling="2025-12-01 10:49:50.499614535 +0000 UTC m=+1107.734085433" observedRunningTime="2025-12-01 10:49:50.810955567 +0000 UTC m=+1108.045426475" watchObservedRunningTime="2025-12-01 10:49:50.813401043 +0000 UTC m=+1108.047871941" Dec 01 10:49:51 crc kubenswrapper[4909]: I1201 10:49:51.800286 4909 generic.go:334] "Generic (PLEG): container finished" podID="9e390846-ddfc-4952-907e-8562a5bc63f8" containerID="c54ec3b4fa9c6e47dce4e5a815f494905aa2c56fbab9adc52920a2d79dc18d1a" exitCode=2 Dec 01 10:49:51 crc kubenswrapper[4909]: I1201 10:49:51.801385 4909 generic.go:334] "Generic (PLEG): container finished" podID="9e390846-ddfc-4952-907e-8562a5bc63f8" containerID="bcbf7ef8a1a283205a204880e9bb988a4c899b693044b726582178e91e44bc26" exitCode=0 Dec 01 10:49:51 crc kubenswrapper[4909]: I1201 10:49:51.800361 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e390846-ddfc-4952-907e-8562a5bc63f8","Type":"ContainerDied","Data":"c54ec3b4fa9c6e47dce4e5a815f494905aa2c56fbab9adc52920a2d79dc18d1a"} Dec 01 10:49:51 crc kubenswrapper[4909]: I1201 10:49:51.801528 
4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e390846-ddfc-4952-907e-8562a5bc63f8","Type":"ContainerDied","Data":"bcbf7ef8a1a283205a204880e9bb988a4c899b693044b726582178e91e44bc26"} Dec 01 10:49:53 crc kubenswrapper[4909]: I1201 10:49:53.278742 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 01 10:49:53 crc kubenswrapper[4909]: I1201 10:49:53.850400 4909 generic.go:334] "Generic (PLEG): container finished" podID="9e390846-ddfc-4952-907e-8562a5bc63f8" containerID="70c3db63cada5186ae6ffca11add9dee9cb7ae0be9e164199e3dc20b2d990e50" exitCode=0 Dec 01 10:49:53 crc kubenswrapper[4909]: I1201 10:49:53.850447 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e390846-ddfc-4952-907e-8562a5bc63f8","Type":"ContainerDied","Data":"70c3db63cada5186ae6ffca11add9dee9cb7ae0be9e164199e3dc20b2d990e50"} Dec 01 10:49:58 crc kubenswrapper[4909]: I1201 10:49:58.917770 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-s699j" event={"ID":"88326ebb-a888-4633-9459-114d0e1f2cc9","Type":"ContainerStarted","Data":"5e5a91b7b7a48182688c61e0c7011c5391ffcd15791ae48afb0a4546fba6616f"} Dec 01 10:50:06 crc kubenswrapper[4909]: I1201 10:50:06.194180 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:50:06 crc kubenswrapper[4909]: I1201 10:50:06.195107 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Dec 01 10:50:14 crc kubenswrapper[4909]: I1201 10:50:14.050803 4909 generic.go:334] "Generic (PLEG): container finished" podID="88326ebb-a888-4633-9459-114d0e1f2cc9" containerID="5e5a91b7b7a48182688c61e0c7011c5391ffcd15791ae48afb0a4546fba6616f" exitCode=0 Dec 01 10:50:14 crc kubenswrapper[4909]: I1201 10:50:14.050921 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-s699j" event={"ID":"88326ebb-a888-4633-9459-114d0e1f2cc9","Type":"ContainerDied","Data":"5e5a91b7b7a48182688c61e0c7011c5391ffcd15791ae48afb0a4546fba6616f"} Dec 01 10:50:15 crc kubenswrapper[4909]: I1201 10:50:15.367747 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-s699j" Dec 01 10:50:15 crc kubenswrapper[4909]: I1201 10:50:15.398823 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88326ebb-a888-4633-9459-114d0e1f2cc9-config-data\") pod \"88326ebb-a888-4633-9459-114d0e1f2cc9\" (UID: \"88326ebb-a888-4633-9459-114d0e1f2cc9\") " Dec 01 10:50:15 crc kubenswrapper[4909]: I1201 10:50:15.399210 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjmjl\" (UniqueName: \"kubernetes.io/projected/88326ebb-a888-4633-9459-114d0e1f2cc9-kube-api-access-kjmjl\") pod \"88326ebb-a888-4633-9459-114d0e1f2cc9\" (UID: \"88326ebb-a888-4633-9459-114d0e1f2cc9\") " Dec 01 10:50:15 crc kubenswrapper[4909]: I1201 10:50:15.399445 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88326ebb-a888-4633-9459-114d0e1f2cc9-combined-ca-bundle\") pod \"88326ebb-a888-4633-9459-114d0e1f2cc9\" (UID: \"88326ebb-a888-4633-9459-114d0e1f2cc9\") " Dec 01 10:50:15 crc kubenswrapper[4909]: I1201 10:50:15.399545 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88326ebb-a888-4633-9459-114d0e1f2cc9-scripts\") pod \"88326ebb-a888-4633-9459-114d0e1f2cc9\" (UID: \"88326ebb-a888-4633-9459-114d0e1f2cc9\") " Dec 01 10:50:15 crc kubenswrapper[4909]: I1201 10:50:15.406004 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88326ebb-a888-4633-9459-114d0e1f2cc9-kube-api-access-kjmjl" (OuterVolumeSpecName: "kube-api-access-kjmjl") pod "88326ebb-a888-4633-9459-114d0e1f2cc9" (UID: "88326ebb-a888-4633-9459-114d0e1f2cc9"). InnerVolumeSpecName "kube-api-access-kjmjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:50:15 crc kubenswrapper[4909]: I1201 10:50:15.406543 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88326ebb-a888-4633-9459-114d0e1f2cc9-scripts" (OuterVolumeSpecName: "scripts") pod "88326ebb-a888-4633-9459-114d0e1f2cc9" (UID: "88326ebb-a888-4633-9459-114d0e1f2cc9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:15 crc kubenswrapper[4909]: I1201 10:50:15.428704 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88326ebb-a888-4633-9459-114d0e1f2cc9-config-data" (OuterVolumeSpecName: "config-data") pod "88326ebb-a888-4633-9459-114d0e1f2cc9" (UID: "88326ebb-a888-4633-9459-114d0e1f2cc9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:15 crc kubenswrapper[4909]: I1201 10:50:15.434049 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88326ebb-a888-4633-9459-114d0e1f2cc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88326ebb-a888-4633-9459-114d0e1f2cc9" (UID: "88326ebb-a888-4633-9459-114d0e1f2cc9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:15 crc kubenswrapper[4909]: I1201 10:50:15.501851 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88326ebb-a888-4633-9459-114d0e1f2cc9-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:15 crc kubenswrapper[4909]: I1201 10:50:15.501920 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjmjl\" (UniqueName: \"kubernetes.io/projected/88326ebb-a888-4633-9459-114d0e1f2cc9-kube-api-access-kjmjl\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:15 crc kubenswrapper[4909]: I1201 10:50:15.501938 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88326ebb-a888-4633-9459-114d0e1f2cc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:15 crc kubenswrapper[4909]: I1201 10:50:15.501949 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88326ebb-a888-4633-9459-114d0e1f2cc9-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:16 crc kubenswrapper[4909]: I1201 10:50:16.077613 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-s699j" event={"ID":"88326ebb-a888-4633-9459-114d0e1f2cc9","Type":"ContainerDied","Data":"45169fa66a38bfabc0b4a396af46e3457a30fd2bbb63c7b174c1c2da34bd5aec"} Dec 01 10:50:16 crc kubenswrapper[4909]: I1201 10:50:16.077893 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45169fa66a38bfabc0b4a396af46e3457a30fd2bbb63c7b174c1c2da34bd5aec" Dec 01 10:50:16 crc kubenswrapper[4909]: I1201 10:50:16.077677 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-s699j" Dec 01 10:50:16 crc kubenswrapper[4909]: I1201 10:50:16.122681 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="9e390846-ddfc-4952-907e-8562a5bc63f8" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 01 10:50:16 crc kubenswrapper[4909]: I1201 10:50:16.184806 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 10:50:16 crc kubenswrapper[4909]: E1201 10:50:16.185400 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88326ebb-a888-4633-9459-114d0e1f2cc9" containerName="nova-cell0-conductor-db-sync" Dec 01 10:50:16 crc kubenswrapper[4909]: I1201 10:50:16.185422 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="88326ebb-a888-4633-9459-114d0e1f2cc9" containerName="nova-cell0-conductor-db-sync" Dec 01 10:50:16 crc kubenswrapper[4909]: I1201 10:50:16.185671 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="88326ebb-a888-4633-9459-114d0e1f2cc9" containerName="nova-cell0-conductor-db-sync" Dec 01 10:50:16 crc kubenswrapper[4909]: I1201 10:50:16.186422 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 10:50:16 crc kubenswrapper[4909]: I1201 10:50:16.188925 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rkftn" Dec 01 10:50:16 crc kubenswrapper[4909]: I1201 10:50:16.191479 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 10:50:16 crc kubenswrapper[4909]: I1201 10:50:16.206461 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 10:50:16 crc kubenswrapper[4909]: I1201 10:50:16.317037 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c254210a-6515-499c-b95c-1bf34961cf05-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c254210a-6515-499c-b95c-1bf34961cf05\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:50:16 crc kubenswrapper[4909]: I1201 10:50:16.317125 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7tth\" (UniqueName: \"kubernetes.io/projected/c254210a-6515-499c-b95c-1bf34961cf05-kube-api-access-c7tth\") pod \"nova-cell0-conductor-0\" (UID: \"c254210a-6515-499c-b95c-1bf34961cf05\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:50:16 crc kubenswrapper[4909]: I1201 10:50:16.317166 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c254210a-6515-499c-b95c-1bf34961cf05-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c254210a-6515-499c-b95c-1bf34961cf05\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:50:16 crc kubenswrapper[4909]: I1201 10:50:16.419964 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c254210a-6515-499c-b95c-1bf34961cf05-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c254210a-6515-499c-b95c-1bf34961cf05\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:50:16 crc kubenswrapper[4909]: I1201 10:50:16.420073 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7tth\" (UniqueName: \"kubernetes.io/projected/c254210a-6515-499c-b95c-1bf34961cf05-kube-api-access-c7tth\") pod \"nova-cell0-conductor-0\" (UID: \"c254210a-6515-499c-b95c-1bf34961cf05\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:50:16 crc kubenswrapper[4909]: I1201 10:50:16.420120 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c254210a-6515-499c-b95c-1bf34961cf05-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c254210a-6515-499c-b95c-1bf34961cf05\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:50:16 crc kubenswrapper[4909]: I1201 10:50:16.425214 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c254210a-6515-499c-b95c-1bf34961cf05-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c254210a-6515-499c-b95c-1bf34961cf05\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:50:16 crc kubenswrapper[4909]: I1201 10:50:16.427583 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c254210a-6515-499c-b95c-1bf34961cf05-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c254210a-6515-499c-b95c-1bf34961cf05\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:50:16 crc kubenswrapper[4909]: I1201 10:50:16.438941 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7tth\" (UniqueName: \"kubernetes.io/projected/c254210a-6515-499c-b95c-1bf34961cf05-kube-api-access-c7tth\") pod \"nova-cell0-conductor-0\" 
(UID: \"c254210a-6515-499c-b95c-1bf34961cf05\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:50:16 crc kubenswrapper[4909]: I1201 10:50:16.504992 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 10:50:17 crc kubenswrapper[4909]: I1201 10:50:17.013292 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 10:50:17 crc kubenswrapper[4909]: I1201 10:50:17.088754 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c254210a-6515-499c-b95c-1bf34961cf05","Type":"ContainerStarted","Data":"510aa804ab46fac089d6761a22617b3c103a989db7a1ee9c26310e057ff0cbf9"} Dec 01 10:50:19 crc kubenswrapper[4909]: I1201 10:50:19.107630 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c254210a-6515-499c-b95c-1bf34961cf05","Type":"ContainerStarted","Data":"ebc653819008251aecce5f273227d21281d0ba5509f5926cb5c6ccdbb2b52e1e"} Dec 01 10:50:19 crc kubenswrapper[4909]: I1201 10:50:19.107985 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 01 10:50:19 crc kubenswrapper[4909]: I1201 10:50:19.129923 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=3.129903871 podStartE2EDuration="3.129903871s" podCreationTimestamp="2025-12-01 10:50:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:50:19.129842529 +0000 UTC m=+1136.364313447" watchObservedRunningTime="2025-12-01 10:50:19.129903871 +0000 UTC m=+1136.364374789" Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.154862 4909 generic.go:334] "Generic (PLEG): container finished" podID="9e390846-ddfc-4952-907e-8562a5bc63f8" 
containerID="e8799d07babc2b86da86ba8d7220b7c972036b9b65f45b2387bdb92657b227ea" exitCode=137 Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.155136 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e390846-ddfc-4952-907e-8562a5bc63f8","Type":"ContainerDied","Data":"e8799d07babc2b86da86ba8d7220b7c972036b9b65f45b2387bdb92657b227ea"} Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.155886 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e390846-ddfc-4952-907e-8562a5bc63f8","Type":"ContainerDied","Data":"01aa55673e6a3e44f5584ce51399de0fdd7822446d03bc7d743fb9cc44d97871"} Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.155914 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01aa55673e6a3e44f5584ce51399de0fdd7822446d03bc7d743fb9cc44d97871" Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.197672 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.312944 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vf5s\" (UniqueName: \"kubernetes.io/projected/9e390846-ddfc-4952-907e-8562a5bc63f8-kube-api-access-2vf5s\") pod \"9e390846-ddfc-4952-907e-8562a5bc63f8\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.313101 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-ceilometer-tls-certs\") pod \"9e390846-ddfc-4952-907e-8562a5bc63f8\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.313168 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-scripts\") pod \"9e390846-ddfc-4952-907e-8562a5bc63f8\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.313257 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e390846-ddfc-4952-907e-8562a5bc63f8-run-httpd\") pod \"9e390846-ddfc-4952-907e-8562a5bc63f8\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.313324 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e390846-ddfc-4952-907e-8562a5bc63f8-log-httpd\") pod \"9e390846-ddfc-4952-907e-8562a5bc63f8\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.313354 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-config-data\") pod \"9e390846-ddfc-4952-907e-8562a5bc63f8\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.313382 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-combined-ca-bundle\") pod \"9e390846-ddfc-4952-907e-8562a5bc63f8\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.313417 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-sg-core-conf-yaml\") pod \"9e390846-ddfc-4952-907e-8562a5bc63f8\" (UID: \"9e390846-ddfc-4952-907e-8562a5bc63f8\") " Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.314229 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e390846-ddfc-4952-907e-8562a5bc63f8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9e390846-ddfc-4952-907e-8562a5bc63f8" (UID: "9e390846-ddfc-4952-907e-8562a5bc63f8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.314744 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e390846-ddfc-4952-907e-8562a5bc63f8-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.315691 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e390846-ddfc-4952-907e-8562a5bc63f8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9e390846-ddfc-4952-907e-8562a5bc63f8" (UID: "9e390846-ddfc-4952-907e-8562a5bc63f8"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.325118 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-scripts" (OuterVolumeSpecName: "scripts") pod "9e390846-ddfc-4952-907e-8562a5bc63f8" (UID: "9e390846-ddfc-4952-907e-8562a5bc63f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.325497 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e390846-ddfc-4952-907e-8562a5bc63f8-kube-api-access-2vf5s" (OuterVolumeSpecName: "kube-api-access-2vf5s") pod "9e390846-ddfc-4952-907e-8562a5bc63f8" (UID: "9e390846-ddfc-4952-907e-8562a5bc63f8"). InnerVolumeSpecName "kube-api-access-2vf5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.348159 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9e390846-ddfc-4952-907e-8562a5bc63f8" (UID: "9e390846-ddfc-4952-907e-8562a5bc63f8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.366205 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9e390846-ddfc-4952-907e-8562a5bc63f8" (UID: "9e390846-ddfc-4952-907e-8562a5bc63f8"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.395006 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e390846-ddfc-4952-907e-8562a5bc63f8" (UID: "9e390846-ddfc-4952-907e-8562a5bc63f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.413288 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-config-data" (OuterVolumeSpecName: "config-data") pod "9e390846-ddfc-4952-907e-8562a5bc63f8" (UID: "9e390846-ddfc-4952-907e-8562a5bc63f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.416617 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vf5s\" (UniqueName: \"kubernetes.io/projected/9e390846-ddfc-4952-907e-8562a5bc63f8-kube-api-access-2vf5s\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.416745 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.416919 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.417051 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e390846-ddfc-4952-907e-8562a5bc63f8-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 
01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.417146 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.417219 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:21 crc kubenswrapper[4909]: I1201 10:50:21.417280 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e390846-ddfc-4952-907e-8562a5bc63f8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.164444 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.200811 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.209598 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.225523 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:50:22 crc kubenswrapper[4909]: E1201 10:50:22.225957 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e390846-ddfc-4952-907e-8562a5bc63f8" containerName="ceilometer-notification-agent" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.225976 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e390846-ddfc-4952-907e-8562a5bc63f8" containerName="ceilometer-notification-agent" Dec 01 10:50:22 crc kubenswrapper[4909]: E1201 10:50:22.225993 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9e390846-ddfc-4952-907e-8562a5bc63f8" containerName="ceilometer-central-agent" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.225999 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e390846-ddfc-4952-907e-8562a5bc63f8" containerName="ceilometer-central-agent" Dec 01 10:50:22 crc kubenswrapper[4909]: E1201 10:50:22.226012 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e390846-ddfc-4952-907e-8562a5bc63f8" containerName="proxy-httpd" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.226019 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e390846-ddfc-4952-907e-8562a5bc63f8" containerName="proxy-httpd" Dec 01 10:50:22 crc kubenswrapper[4909]: E1201 10:50:22.226031 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e390846-ddfc-4952-907e-8562a5bc63f8" containerName="sg-core" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.226038 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e390846-ddfc-4952-907e-8562a5bc63f8" containerName="sg-core" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.226196 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e390846-ddfc-4952-907e-8562a5bc63f8" containerName="sg-core" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.226213 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e390846-ddfc-4952-907e-8562a5bc63f8" containerName="proxy-httpd" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.226222 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e390846-ddfc-4952-907e-8562a5bc63f8" containerName="ceilometer-notification-agent" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.226233 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e390846-ddfc-4952-907e-8562a5bc63f8" containerName="ceilometer-central-agent" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.228020 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.231836 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.234420 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.235449 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.244547 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.334765 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqpqd\" (UniqueName: \"kubernetes.io/projected/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-kube-api-access-fqpqd\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.335268 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-log-httpd\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.335483 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-scripts\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.335727 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-config-data\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.335774 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.335833 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.335983 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-run-httpd\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.336094 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.437967 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-config-data\") pod \"ceilometer-0\" (UID: 
\"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.438024 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.438052 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.438081 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-run-httpd\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.438109 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.438142 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqpqd\" (UniqueName: \"kubernetes.io/projected/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-kube-api-access-fqpqd\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.438215 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-log-httpd\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.438249 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-scripts\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.438677 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-run-httpd\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.439032 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-log-httpd\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.444815 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.444909 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-scripts\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: 
I1201 10:50:22.454939 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.461767 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.462842 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-config-data\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.465179 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqpqd\" (UniqueName: \"kubernetes.io/projected/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-kube-api-access-fqpqd\") pod \"ceilometer-0\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " pod="openstack/ceilometer-0" Dec 01 10:50:22 crc kubenswrapper[4909]: I1201 10:50:22.545712 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:50:23 crc kubenswrapper[4909]: I1201 10:50:23.009156 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:50:23 crc kubenswrapper[4909]: W1201 10:50:23.015170 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d6fd8c0_7164_43fe_b1c6_33cd35ac8dee.slice/crio-cb15814971d35c3c3877fd0f16d265edfaf16938755ed2833af50e93b05634ea WatchSource:0}: Error finding container cb15814971d35c3c3877fd0f16d265edfaf16938755ed2833af50e93b05634ea: Status 404 returned error can't find the container with id cb15814971d35c3c3877fd0f16d265edfaf16938755ed2833af50e93b05634ea Dec 01 10:50:23 crc kubenswrapper[4909]: I1201 10:50:23.175231 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee","Type":"ContainerStarted","Data":"cb15814971d35c3c3877fd0f16d265edfaf16938755ed2833af50e93b05634ea"} Dec 01 10:50:23 crc kubenswrapper[4909]: I1201 10:50:23.269733 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e390846-ddfc-4952-907e-8562a5bc63f8" path="/var/lib/kubelet/pods/9e390846-ddfc-4952-907e-8562a5bc63f8/volumes" Dec 01 10:50:24 crc kubenswrapper[4909]: I1201 10:50:24.188634 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee","Type":"ContainerStarted","Data":"01ec51ae59c583b44131b185cf83cff6215431c90f62f5903412a2fa0a1a089a"} Dec 01 10:50:25 crc kubenswrapper[4909]: I1201 10:50:25.199820 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee","Type":"ContainerStarted","Data":"1221e9a1d560e73827baa95fefe2155097b4a605a0dbac521dd2cd90a9db886a"} Dec 01 10:50:26 crc kubenswrapper[4909]: I1201 10:50:26.211847 4909 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee","Type":"ContainerStarted","Data":"c4e291e17088f3bd6999a1f46b71ebae3888a1b39ba4ce330ed35bbf7715990d"} Dec 01 10:50:26 crc kubenswrapper[4909]: I1201 10:50:26.531338 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.051967 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-9whwl"] Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.053502 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9whwl" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.056740 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.058976 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.099327 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9whwl"] Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.129645 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4556e9cd-36ab-411b-9065-0b74a4c426a5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9whwl\" (UID: \"4556e9cd-36ab-411b-9065-0b74a4c426a5\") " pod="openstack/nova-cell0-cell-mapping-9whwl" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.130105 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4556e9cd-36ab-411b-9065-0b74a4c426a5-scripts\") pod \"nova-cell0-cell-mapping-9whwl\" (UID: \"4556e9cd-36ab-411b-9065-0b74a4c426a5\") " 
pod="openstack/nova-cell0-cell-mapping-9whwl" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.130309 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv2pm\" (UniqueName: \"kubernetes.io/projected/4556e9cd-36ab-411b-9065-0b74a4c426a5-kube-api-access-vv2pm\") pod \"nova-cell0-cell-mapping-9whwl\" (UID: \"4556e9cd-36ab-411b-9065-0b74a4c426a5\") " pod="openstack/nova-cell0-cell-mapping-9whwl" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.130420 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4556e9cd-36ab-411b-9065-0b74a4c426a5-config-data\") pod \"nova-cell0-cell-mapping-9whwl\" (UID: \"4556e9cd-36ab-411b-9065-0b74a4c426a5\") " pod="openstack/nova-cell0-cell-mapping-9whwl" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.231953 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4556e9cd-36ab-411b-9065-0b74a4c426a5-scripts\") pod \"nova-cell0-cell-mapping-9whwl\" (UID: \"4556e9cd-36ab-411b-9065-0b74a4c426a5\") " pod="openstack/nova-cell0-cell-mapping-9whwl" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.232452 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv2pm\" (UniqueName: \"kubernetes.io/projected/4556e9cd-36ab-411b-9065-0b74a4c426a5-kube-api-access-vv2pm\") pod \"nova-cell0-cell-mapping-9whwl\" (UID: \"4556e9cd-36ab-411b-9065-0b74a4c426a5\") " pod="openstack/nova-cell0-cell-mapping-9whwl" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.232498 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4556e9cd-36ab-411b-9065-0b74a4c426a5-config-data\") pod \"nova-cell0-cell-mapping-9whwl\" (UID: \"4556e9cd-36ab-411b-9065-0b74a4c426a5\") " 
pod="openstack/nova-cell0-cell-mapping-9whwl" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.232551 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4556e9cd-36ab-411b-9065-0b74a4c426a5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9whwl\" (UID: \"4556e9cd-36ab-411b-9065-0b74a4c426a5\") " pod="openstack/nova-cell0-cell-mapping-9whwl" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.239077 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.243698 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.254412 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.255260 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4556e9cd-36ab-411b-9065-0b74a4c426a5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9whwl\" (UID: \"4556e9cd-36ab-411b-9065-0b74a4c426a5\") " pod="openstack/nova-cell0-cell-mapping-9whwl" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.258197 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4556e9cd-36ab-411b-9065-0b74a4c426a5-scripts\") pod \"nova-cell0-cell-mapping-9whwl\" (UID: \"4556e9cd-36ab-411b-9065-0b74a4c426a5\") " pod="openstack/nova-cell0-cell-mapping-9whwl" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.270960 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4556e9cd-36ab-411b-9065-0b74a4c426a5-config-data\") pod \"nova-cell0-cell-mapping-9whwl\" (UID: \"4556e9cd-36ab-411b-9065-0b74a4c426a5\") " 
pod="openstack/nova-cell0-cell-mapping-9whwl" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.276362 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.281389 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv2pm\" (UniqueName: \"kubernetes.io/projected/4556e9cd-36ab-411b-9065-0b74a4c426a5-kube-api-access-vv2pm\") pod \"nova-cell0-cell-mapping-9whwl\" (UID: \"4556e9cd-36ab-411b-9065-0b74a4c426a5\") " pod="openstack/nova-cell0-cell-mapping-9whwl" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.335405 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9a43e441-37c5-4ada-9bb8-f610dd18a5f9\") " pod="openstack/nova-api-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.335483 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jpzz\" (UniqueName: \"kubernetes.io/projected/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-kube-api-access-5jpzz\") pod \"nova-api-0\" (UID: \"9a43e441-37c5-4ada-9bb8-f610dd18a5f9\") " pod="openstack/nova-api-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.335531 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-logs\") pod \"nova-api-0\" (UID: \"9a43e441-37c5-4ada-9bb8-f610dd18a5f9\") " pod="openstack/nova-api-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.335552 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-config-data\") pod \"nova-api-0\" (UID: 
\"9a43e441-37c5-4ada-9bb8-f610dd18a5f9\") " pod="openstack/nova-api-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.400642 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9whwl" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.403955 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.405347 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.422322 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.434149 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.436055 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.446685 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.463790 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9a43e441-37c5-4ada-9bb8-f610dd18a5f9\") " pod="openstack/nova-api-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.463950 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jpzz\" (UniqueName: \"kubernetes.io/projected/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-kube-api-access-5jpzz\") pod \"nova-api-0\" (UID: \"9a43e441-37c5-4ada-9bb8-f610dd18a5f9\") " pod="openstack/nova-api-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.464977 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-logs\") pod \"nova-api-0\" (UID: \"9a43e441-37c5-4ada-9bb8-f610dd18a5f9\") " pod="openstack/nova-api-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.465058 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-config-data\") pod \"nova-api-0\" (UID: \"9a43e441-37c5-4ada-9bb8-f610dd18a5f9\") " pod="openstack/nova-api-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.471330 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-logs\") pod \"nova-api-0\" (UID: \"9a43e441-37c5-4ada-9bb8-f610dd18a5f9\") " pod="openstack/nova-api-0" Dec 01 10:50:27 crc kubenswrapper[4909]: 
I1201 10:50:27.475292 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-config-data\") pod \"nova-api-0\" (UID: \"9a43e441-37c5-4ada-9bb8-f610dd18a5f9\") " pod="openstack/nova-api-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.481342 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.503891 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.512703 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9a43e441-37c5-4ada-9bb8-f610dd18a5f9\") " pod="openstack/nova-api-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.572835 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3a8241c-b729-4b4f-96b1-10cf76a9e548-logs\") pod \"nova-metadata-0\" (UID: \"d3a8241c-b729-4b4f-96b1-10cf76a9e548\") " pod="openstack/nova-metadata-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.572916 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a8241c-b729-4b4f-96b1-10cf76a9e548-config-data\") pod \"nova-metadata-0\" (UID: \"d3a8241c-b729-4b4f-96b1-10cf76a9e548\") " pod="openstack/nova-metadata-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.572978 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf02fa2a-a364-448c-be13-3e2ec28e591f-combined-ca-bundle\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"cf02fa2a-a364-448c-be13-3e2ec28e591f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.573059 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a8241c-b729-4b4f-96b1-10cf76a9e548-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d3a8241c-b729-4b4f-96b1-10cf76a9e548\") " pod="openstack/nova-metadata-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.573115 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwz7s\" (UniqueName: \"kubernetes.io/projected/d3a8241c-b729-4b4f-96b1-10cf76a9e548-kube-api-access-jwz7s\") pod \"nova-metadata-0\" (UID: \"d3a8241c-b729-4b4f-96b1-10cf76a9e548\") " pod="openstack/nova-metadata-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.573157 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mqkl\" (UniqueName: \"kubernetes.io/projected/cf02fa2a-a364-448c-be13-3e2ec28e591f-kube-api-access-5mqkl\") pod \"nova-cell1-novncproxy-0\" (UID: \"cf02fa2a-a364-448c-be13-3e2ec28e591f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.573187 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf02fa2a-a364-448c-be13-3e2ec28e591f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cf02fa2a-a364-448c-be13-3e2ec28e591f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.578393 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jpzz\" (UniqueName: \"kubernetes.io/projected/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-kube-api-access-5jpzz\") pod \"nova-api-0\" (UID: 
\"9a43e441-37c5-4ada-9bb8-f610dd18a5f9\") " pod="openstack/nova-api-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.629309 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.680458 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a8241c-b729-4b4f-96b1-10cf76a9e548-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d3a8241c-b729-4b4f-96b1-10cf76a9e548\") " pod="openstack/nova-metadata-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.699496 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwz7s\" (UniqueName: \"kubernetes.io/projected/d3a8241c-b729-4b4f-96b1-10cf76a9e548-kube-api-access-jwz7s\") pod \"nova-metadata-0\" (UID: \"d3a8241c-b729-4b4f-96b1-10cf76a9e548\") " pod="openstack/nova-metadata-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.699622 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mqkl\" (UniqueName: \"kubernetes.io/projected/cf02fa2a-a364-448c-be13-3e2ec28e591f-kube-api-access-5mqkl\") pod \"nova-cell1-novncproxy-0\" (UID: \"cf02fa2a-a364-448c-be13-3e2ec28e591f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.699673 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf02fa2a-a364-448c-be13-3e2ec28e591f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cf02fa2a-a364-448c-be13-3e2ec28e591f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.699782 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3a8241c-b729-4b4f-96b1-10cf76a9e548-logs\") pod 
\"nova-metadata-0\" (UID: \"d3a8241c-b729-4b4f-96b1-10cf76a9e548\") " pod="openstack/nova-metadata-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.699812 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a8241c-b729-4b4f-96b1-10cf76a9e548-config-data\") pod \"nova-metadata-0\" (UID: \"d3a8241c-b729-4b4f-96b1-10cf76a9e548\") " pod="openstack/nova-metadata-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.747112 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf02fa2a-a364-448c-be13-3e2ec28e591f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cf02fa2a-a364-448c-be13-3e2ec28e591f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.753186 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a8241c-b729-4b4f-96b1-10cf76a9e548-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d3a8241c-b729-4b4f-96b1-10cf76a9e548\") " pod="openstack/nova-metadata-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.760782 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a8241c-b729-4b4f-96b1-10cf76a9e548-config-data\") pod \"nova-metadata-0\" (UID: \"d3a8241c-b729-4b4f-96b1-10cf76a9e548\") " pod="openstack/nova-metadata-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.763312 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.763658 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf02fa2a-a364-448c-be13-3e2ec28e591f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"cf02fa2a-a364-448c-be13-3e2ec28e591f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.770762 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf02fa2a-a364-448c-be13-3e2ec28e591f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cf02fa2a-a364-448c-be13-3e2ec28e591f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.772793 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.772938 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mqkl\" (UniqueName: \"kubernetes.io/projected/cf02fa2a-a364-448c-be13-3e2ec28e591f-kube-api-access-5mqkl\") pod \"nova-cell1-novncproxy-0\" (UID: \"cf02fa2a-a364-448c-be13-3e2ec28e591f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.778522 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.780719 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3a8241c-b729-4b4f-96b1-10cf76a9e548-logs\") pod \"nova-metadata-0\" (UID: \"d3a8241c-b729-4b4f-96b1-10cf76a9e548\") " pod="openstack/nova-metadata-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.799118 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwz7s\" (UniqueName: \"kubernetes.io/projected/d3a8241c-b729-4b4f-96b1-10cf76a9e548-kube-api-access-jwz7s\") pod \"nova-metadata-0\" (UID: \"d3a8241c-b729-4b4f-96b1-10cf76a9e548\") " pod="openstack/nova-metadata-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.862477 4909 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.874351 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-qzhsz"] Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.880596 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.901608 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-qzhsz"] Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.951904 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmx5g\" (UniqueName: \"kubernetes.io/projected/f9bcf0a4-b949-4dc4-8846-7bcefd6961bc-kube-api-access-wmx5g\") pod \"nova-scheduler-0\" (UID: \"f9bcf0a4-b949-4dc4-8846-7bcefd6961bc\") " pod="openstack/nova-scheduler-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.952015 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-dns-svc\") pod \"dnsmasq-dns-566b5b7845-qzhsz\" (UID: \"2e8e06da-87e5-467f-8938-0a349513a0c8\") " pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.952065 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-qzhsz\" (UID: \"2e8e06da-87e5-467f-8938-0a349513a0c8\") " pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.952094 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-config\") pod \"dnsmasq-dns-566b5b7845-qzhsz\" (UID: \"2e8e06da-87e5-467f-8938-0a349513a0c8\") " pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.952126 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-qzhsz\" (UID: \"2e8e06da-87e5-467f-8938-0a349513a0c8\") " pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.952217 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9bcf0a4-b949-4dc4-8846-7bcefd6961bc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f9bcf0a4-b949-4dc4-8846-7bcefd6961bc\") " pod="openstack/nova-scheduler-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.952240 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmprk\" (UniqueName: \"kubernetes.io/projected/2e8e06da-87e5-467f-8938-0a349513a0c8-kube-api-access-fmprk\") pod \"dnsmasq-dns-566b5b7845-qzhsz\" (UID: \"2e8e06da-87e5-467f-8938-0a349513a0c8\") " pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.952260 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9bcf0a4-b949-4dc4-8846-7bcefd6961bc-config-data\") pod \"nova-scheduler-0\" (UID: \"f9bcf0a4-b949-4dc4-8846-7bcefd6961bc\") " pod="openstack/nova-scheduler-0" Dec 01 10:50:27 crc kubenswrapper[4909]: I1201 10:50:27.959410 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.002912 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.053896 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9bcf0a4-b949-4dc4-8846-7bcefd6961bc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f9bcf0a4-b949-4dc4-8846-7bcefd6961bc\") " pod="openstack/nova-scheduler-0" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.055975 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmprk\" (UniqueName: \"kubernetes.io/projected/2e8e06da-87e5-467f-8938-0a349513a0c8-kube-api-access-fmprk\") pod \"dnsmasq-dns-566b5b7845-qzhsz\" (UID: \"2e8e06da-87e5-467f-8938-0a349513a0c8\") " pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.056429 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9bcf0a4-b949-4dc4-8846-7bcefd6961bc-config-data\") pod \"nova-scheduler-0\" (UID: \"f9bcf0a4-b949-4dc4-8846-7bcefd6961bc\") " pod="openstack/nova-scheduler-0" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.056586 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmx5g\" (UniqueName: \"kubernetes.io/projected/f9bcf0a4-b949-4dc4-8846-7bcefd6961bc-kube-api-access-wmx5g\") pod \"nova-scheduler-0\" (UID: \"f9bcf0a4-b949-4dc4-8846-7bcefd6961bc\") " pod="openstack/nova-scheduler-0" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.057769 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-dns-svc\") pod 
\"dnsmasq-dns-566b5b7845-qzhsz\" (UID: \"2e8e06da-87e5-467f-8938-0a349513a0c8\") " pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.057975 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-qzhsz\" (UID: \"2e8e06da-87e5-467f-8938-0a349513a0c8\") " pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.058257 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-config\") pod \"dnsmasq-dns-566b5b7845-qzhsz\" (UID: \"2e8e06da-87e5-467f-8938-0a349513a0c8\") " pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.059583 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-qzhsz\" (UID: \"2e8e06da-87e5-467f-8938-0a349513a0c8\") " pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.060049 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-dns-svc\") pod \"dnsmasq-dns-566b5b7845-qzhsz\" (UID: \"2e8e06da-87e5-467f-8938-0a349513a0c8\") " pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.059168 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-qzhsz\" (UID: \"2e8e06da-87e5-467f-8938-0a349513a0c8\") " 
pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.059386 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-config\") pod \"dnsmasq-dns-566b5b7845-qzhsz\" (UID: \"2e8e06da-87e5-467f-8938-0a349513a0c8\") " pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.061973 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-qzhsz\" (UID: \"2e8e06da-87e5-467f-8938-0a349513a0c8\") " pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.072717 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9bcf0a4-b949-4dc4-8846-7bcefd6961bc-config-data\") pod \"nova-scheduler-0\" (UID: \"f9bcf0a4-b949-4dc4-8846-7bcefd6961bc\") " pod="openstack/nova-scheduler-0" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.074867 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9bcf0a4-b949-4dc4-8846-7bcefd6961bc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f9bcf0a4-b949-4dc4-8846-7bcefd6961bc\") " pod="openstack/nova-scheduler-0" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.088382 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmprk\" (UniqueName: \"kubernetes.io/projected/2e8e06da-87e5-467f-8938-0a349513a0c8-kube-api-access-fmprk\") pod \"dnsmasq-dns-566b5b7845-qzhsz\" (UID: \"2e8e06da-87e5-467f-8938-0a349513a0c8\") " pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.105860 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wmx5g\" (UniqueName: \"kubernetes.io/projected/f9bcf0a4-b949-4dc4-8846-7bcefd6961bc-kube-api-access-wmx5g\") pod \"nova-scheduler-0\" (UID: \"f9bcf0a4-b949-4dc4-8846-7bcefd6961bc\") " pod="openstack/nova-scheduler-0" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.157600 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.301479 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee","Type":"ContainerStarted","Data":"9a1b0540843816fa4ca68108f262de431803a2d592aacac4dc34bfef65cda9cc"} Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.305671 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.353959 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.000386997 podStartE2EDuration="6.353923502s" podCreationTimestamp="2025-12-01 10:50:22 +0000 UTC" firstStartedPulling="2025-12-01 10:50:23.016820954 +0000 UTC m=+1140.251291852" lastFinishedPulling="2025-12-01 10:50:27.370357459 +0000 UTC m=+1144.604828357" observedRunningTime="2025-12-01 10:50:28.334947931 +0000 UTC m=+1145.569418829" watchObservedRunningTime="2025-12-01 10:50:28.353923502 +0000 UTC m=+1145.588394410" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.388070 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.426638 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.507007 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9whwl"] Dec 01 10:50:28 crc kubenswrapper[4909]: W1201 10:50:28.510518 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a43e441_37c5_4ada_9bb8_f610dd18a5f9.slice/crio-2007e53a3e3d63407478245d538c7ac48e3e57d5de43bd788e95fb0327f1ac8f WatchSource:0}: Error finding container 2007e53a3e3d63407478245d538c7ac48e3e57d5de43bd788e95fb0327f1ac8f: Status 404 returned error can't find the container with id 2007e53a3e3d63407478245d538c7ac48e3e57d5de43bd788e95fb0327f1ac8f Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.889976 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bqghh"] Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.891621 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bqghh" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.901462 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.905950 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.932001 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 10:50:28 crc kubenswrapper[4909]: I1201 10:50:28.969963 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bqghh"] Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.005963 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.008456 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6clcd\" (UniqueName: \"kubernetes.io/projected/8ddd315c-8e96-40fd-ba04-48bf855ac533-kube-api-access-6clcd\") pod \"nova-cell1-conductor-db-sync-bqghh\" (UID: \"8ddd315c-8e96-40fd-ba04-48bf855ac533\") " pod="openstack/nova-cell1-conductor-db-sync-bqghh" Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.008542 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ddd315c-8e96-40fd-ba04-48bf855ac533-config-data\") pod \"nova-cell1-conductor-db-sync-bqghh\" (UID: \"8ddd315c-8e96-40fd-ba04-48bf855ac533\") " pod="openstack/nova-cell1-conductor-db-sync-bqghh" Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.008568 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8ddd315c-8e96-40fd-ba04-48bf855ac533-scripts\") pod \"nova-cell1-conductor-db-sync-bqghh\" (UID: \"8ddd315c-8e96-40fd-ba04-48bf855ac533\") " pod="openstack/nova-cell1-conductor-db-sync-bqghh" Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.008622 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ddd315c-8e96-40fd-ba04-48bf855ac533-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bqghh\" (UID: \"8ddd315c-8e96-40fd-ba04-48bf855ac533\") " pod="openstack/nova-cell1-conductor-db-sync-bqghh" Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.080081 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.111144 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ddd315c-8e96-40fd-ba04-48bf855ac533-config-data\") pod \"nova-cell1-conductor-db-sync-bqghh\" (UID: \"8ddd315c-8e96-40fd-ba04-48bf855ac533\") " pod="openstack/nova-cell1-conductor-db-sync-bqghh" Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.111197 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ddd315c-8e96-40fd-ba04-48bf855ac533-scripts\") pod \"nova-cell1-conductor-db-sync-bqghh\" (UID: \"8ddd315c-8e96-40fd-ba04-48bf855ac533\") " pod="openstack/nova-cell1-conductor-db-sync-bqghh" Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.111259 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ddd315c-8e96-40fd-ba04-48bf855ac533-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bqghh\" (UID: \"8ddd315c-8e96-40fd-ba04-48bf855ac533\") " pod="openstack/nova-cell1-conductor-db-sync-bqghh" Dec 01 10:50:29 crc 
kubenswrapper[4909]: I1201 10:50:29.111338 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6clcd\" (UniqueName: \"kubernetes.io/projected/8ddd315c-8e96-40fd-ba04-48bf855ac533-kube-api-access-6clcd\") pod \"nova-cell1-conductor-db-sync-bqghh\" (UID: \"8ddd315c-8e96-40fd-ba04-48bf855ac533\") " pod="openstack/nova-cell1-conductor-db-sync-bqghh" Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.140688 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6clcd\" (UniqueName: \"kubernetes.io/projected/8ddd315c-8e96-40fd-ba04-48bf855ac533-kube-api-access-6clcd\") pod \"nova-cell1-conductor-db-sync-bqghh\" (UID: \"8ddd315c-8e96-40fd-ba04-48bf855ac533\") " pod="openstack/nova-cell1-conductor-db-sync-bqghh" Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.140765 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ddd315c-8e96-40fd-ba04-48bf855ac533-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bqghh\" (UID: \"8ddd315c-8e96-40fd-ba04-48bf855ac533\") " pod="openstack/nova-cell1-conductor-db-sync-bqghh" Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.142742 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ddd315c-8e96-40fd-ba04-48bf855ac533-scripts\") pod \"nova-cell1-conductor-db-sync-bqghh\" (UID: \"8ddd315c-8e96-40fd-ba04-48bf855ac533\") " pod="openstack/nova-cell1-conductor-db-sync-bqghh" Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.142786 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ddd315c-8e96-40fd-ba04-48bf855ac533-config-data\") pod \"nova-cell1-conductor-db-sync-bqghh\" (UID: \"8ddd315c-8e96-40fd-ba04-48bf855ac533\") " pod="openstack/nova-cell1-conductor-db-sync-bqghh" Dec 01 10:50:29 crc 
kubenswrapper[4909]: I1201 10:50:29.165790 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-qzhsz"] Dec 01 10:50:29 crc kubenswrapper[4909]: W1201 10:50:29.176810 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e8e06da_87e5_467f_8938_0a349513a0c8.slice/crio-2850e5f1684993862511b6fcb2e986ac2f545c8c74b1f7382b22391bff450077 WatchSource:0}: Error finding container 2850e5f1684993862511b6fcb2e986ac2f545c8c74b1f7382b22391bff450077: Status 404 returned error can't find the container with id 2850e5f1684993862511b6fcb2e986ac2f545c8c74b1f7382b22391bff450077 Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.230526 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bqghh" Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.343521 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cf02fa2a-a364-448c-be13-3e2ec28e591f","Type":"ContainerStarted","Data":"9ed66554aa3d01e06d43d78e074d2a40d841a92d43ba23420149e5b71690b3c1"} Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.349650 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9whwl" event={"ID":"4556e9cd-36ab-411b-9065-0b74a4c426a5","Type":"ContainerStarted","Data":"cf93c045bda5c27f82b5e9a4fa58e99c37c3561a467b54813269308f17bfdf9c"} Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.349697 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9whwl" event={"ID":"4556e9cd-36ab-411b-9065-0b74a4c426a5","Type":"ContainerStarted","Data":"9d26e1bac7fc2b0f5f6ef09aaf2f483cc5d2b55ea657bf6c5639fa70f570a472"} Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.363399 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"f9bcf0a4-b949-4dc4-8846-7bcefd6961bc","Type":"ContainerStarted","Data":"a90f3e957679e4d63e36bbd8ee9d05524afbf9d6f5d09c45a8ba7174561030da"} Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.387387 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-9whwl" podStartSLOduration=2.387354528 podStartE2EDuration="2.387354528s" podCreationTimestamp="2025-12-01 10:50:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:50:29.372461844 +0000 UTC m=+1146.606932752" watchObservedRunningTime="2025-12-01 10:50:29.387354528 +0000 UTC m=+1146.621825426" Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.396212 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9a43e441-37c5-4ada-9bb8-f610dd18a5f9","Type":"ContainerStarted","Data":"2007e53a3e3d63407478245d538c7ac48e3e57d5de43bd788e95fb0327f1ac8f"} Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.399782 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" event={"ID":"2e8e06da-87e5-467f-8938-0a349513a0c8","Type":"ContainerStarted","Data":"2850e5f1684993862511b6fcb2e986ac2f545c8c74b1f7382b22391bff450077"} Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.426070 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d3a8241c-b729-4b4f-96b1-10cf76a9e548","Type":"ContainerStarted","Data":"a1441d55d779a8ae89eebfa007737f859ffc64e778a96d37b61f6209e44f81a0"} Dec 01 10:50:29 crc kubenswrapper[4909]: I1201 10:50:29.838469 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bqghh"] Dec 01 10:50:29 crc kubenswrapper[4909]: W1201 10:50:29.862522 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ddd315c_8e96_40fd_ba04_48bf855ac533.slice/crio-a09de2c85832dd6f166ecc76dcd32e8dc377c319e073a89a049f760f390248c6 WatchSource:0}: Error finding container a09de2c85832dd6f166ecc76dcd32e8dc377c319e073a89a049f760f390248c6: Status 404 returned error can't find the container with id a09de2c85832dd6f166ecc76dcd32e8dc377c319e073a89a049f760f390248c6 Dec 01 10:50:30 crc kubenswrapper[4909]: I1201 10:50:30.448288 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bqghh" event={"ID":"8ddd315c-8e96-40fd-ba04-48bf855ac533","Type":"ContainerStarted","Data":"3ada28d3082887f0c23f4cabfbfbb741cea5332095b300be69dcd94609ddd8fc"} Dec 01 10:50:30 crc kubenswrapper[4909]: I1201 10:50:30.448649 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bqghh" event={"ID":"8ddd315c-8e96-40fd-ba04-48bf855ac533","Type":"ContainerStarted","Data":"a09de2c85832dd6f166ecc76dcd32e8dc377c319e073a89a049f760f390248c6"} Dec 01 10:50:30 crc kubenswrapper[4909]: I1201 10:50:30.466357 4909 generic.go:334] "Generic (PLEG): container finished" podID="2e8e06da-87e5-467f-8938-0a349513a0c8" containerID="71f0ebd1e2f44c31728aa1f855063237d4dd258bf5ca23184ca4f5ebfc7b3532" exitCode=0 Dec 01 10:50:30 crc kubenswrapper[4909]: I1201 10:50:30.471358 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" event={"ID":"2e8e06da-87e5-467f-8938-0a349513a0c8","Type":"ContainerDied","Data":"71f0ebd1e2f44c31728aa1f855063237d4dd258bf5ca23184ca4f5ebfc7b3532"} Dec 01 10:50:30 crc kubenswrapper[4909]: I1201 10:50:30.476539 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-bqghh" podStartSLOduration=2.476515652 podStartE2EDuration="2.476515652s" podCreationTimestamp="2025-12-01 10:50:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:50:30.474061815 +0000 UTC m=+1147.708532723" watchObservedRunningTime="2025-12-01 10:50:30.476515652 +0000 UTC m=+1147.710986550" Dec 01 10:50:31 crc kubenswrapper[4909]: I1201 10:50:31.598180 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:50:31 crc kubenswrapper[4909]: I1201 10:50:31.681490 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 10:50:33 crc kubenswrapper[4909]: I1201 10:50:33.504611 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d3a8241c-b729-4b4f-96b1-10cf76a9e548","Type":"ContainerStarted","Data":"d101e4d05bff9a0da8411169694ad83f38723ed9ef4d0b042ce0052425c1d753"} Dec 01 10:50:33 crc kubenswrapper[4909]: I1201 10:50:33.511185 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cf02fa2a-a364-448c-be13-3e2ec28e591f","Type":"ContainerStarted","Data":"e2312592bf88aaf0b98d60601b816a56d1ee3d31d786bfd6630686b55df85c37"} Dec 01 10:50:33 crc kubenswrapper[4909]: I1201 10:50:33.511715 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="cf02fa2a-a364-448c-be13-3e2ec28e591f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://e2312592bf88aaf0b98d60601b816a56d1ee3d31d786bfd6630686b55df85c37" gracePeriod=30 Dec 01 10:50:33 crc kubenswrapper[4909]: I1201 10:50:33.523429 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f9bcf0a4-b949-4dc4-8846-7bcefd6961bc","Type":"ContainerStarted","Data":"79d74f4d034728278ddf1bbcb348d7412f4b09e11b856263ef977121dc2bc398"} Dec 01 10:50:33 crc kubenswrapper[4909]: I1201 10:50:33.527842 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"9a43e441-37c5-4ada-9bb8-f610dd18a5f9","Type":"ContainerStarted","Data":"128f7a7b48e854c2eb1d60c78f8b94137e95d0688068affb37014abeb6b28f58"} Dec 01 10:50:33 crc kubenswrapper[4909]: I1201 10:50:33.539468 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" event={"ID":"2e8e06da-87e5-467f-8938-0a349513a0c8","Type":"ContainerStarted","Data":"a26ae3f80b5c864852d35d53a6296497e0fc4ffc4f14c21a11ce96126294dd1d"} Dec 01 10:50:33 crc kubenswrapper[4909]: I1201 10:50:33.540316 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" Dec 01 10:50:33 crc kubenswrapper[4909]: I1201 10:50:33.563471 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.580896591 podStartE2EDuration="6.563443614s" podCreationTimestamp="2025-12-01 10:50:27 +0000 UTC" firstStartedPulling="2025-12-01 10:50:29.00940052 +0000 UTC m=+1146.243871418" lastFinishedPulling="2025-12-01 10:50:32.991947543 +0000 UTC m=+1150.226418441" observedRunningTime="2025-12-01 10:50:33.534399729 +0000 UTC m=+1150.768870637" watchObservedRunningTime="2025-12-01 10:50:33.563443614 +0000 UTC m=+1150.797914522" Dec 01 10:50:33 crc kubenswrapper[4909]: I1201 10:50:33.563713 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.655838726 podStartE2EDuration="6.5636426s" podCreationTimestamp="2025-12-01 10:50:27 +0000 UTC" firstStartedPulling="2025-12-01 10:50:29.088050531 +0000 UTC m=+1146.322521429" lastFinishedPulling="2025-12-01 10:50:32.995854395 +0000 UTC m=+1150.230325303" observedRunningTime="2025-12-01 10:50:33.550547752 +0000 UTC m=+1150.785018670" watchObservedRunningTime="2025-12-01 10:50:33.5636426 +0000 UTC m=+1150.798113518" Dec 01 10:50:33 crc kubenswrapper[4909]: I1201 10:50:33.582630 4909 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" podStartSLOduration=6.582612031 podStartE2EDuration="6.582612031s" podCreationTimestamp="2025-12-01 10:50:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:50:33.577681417 +0000 UTC m=+1150.812152315" watchObservedRunningTime="2025-12-01 10:50:33.582612031 +0000 UTC m=+1150.817082929" Dec 01 10:50:34 crc kubenswrapper[4909]: I1201 10:50:34.560214 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d3a8241c-b729-4b4f-96b1-10cf76a9e548","Type":"ContainerStarted","Data":"deb0e17fdd02e88e97d2375072cd3352a0d3769e6145691bf1c774e7f4d65654"} Dec 01 10:50:34 crc kubenswrapper[4909]: I1201 10:50:34.560584 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d3a8241c-b729-4b4f-96b1-10cf76a9e548" containerName="nova-metadata-log" containerID="cri-o://d101e4d05bff9a0da8411169694ad83f38723ed9ef4d0b042ce0052425c1d753" gracePeriod=30 Dec 01 10:50:34 crc kubenswrapper[4909]: I1201 10:50:34.561009 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d3a8241c-b729-4b4f-96b1-10cf76a9e548" containerName="nova-metadata-metadata" containerID="cri-o://deb0e17fdd02e88e97d2375072cd3352a0d3769e6145691bf1c774e7f4d65654" gracePeriod=30 Dec 01 10:50:34 crc kubenswrapper[4909]: I1201 10:50:34.565515 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9a43e441-37c5-4ada-9bb8-f610dd18a5f9","Type":"ContainerStarted","Data":"d74a13493d137e296ab28e41b51c2b340251a8e9df2bc934a7e4ec011b78a7e3"} Dec 01 10:50:34 crc kubenswrapper[4909]: I1201 10:50:34.593615 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.506762284 podStartE2EDuration="7.593589197s" 
podCreationTimestamp="2025-12-01 10:50:27 +0000 UTC" firstStartedPulling="2025-12-01 10:50:28.905193942 +0000 UTC m=+1146.139664840" lastFinishedPulling="2025-12-01 10:50:32.992020855 +0000 UTC m=+1150.226491753" observedRunningTime="2025-12-01 10:50:34.593480694 +0000 UTC m=+1151.827951592" watchObservedRunningTime="2025-12-01 10:50:34.593589197 +0000 UTC m=+1151.828060095" Dec 01 10:50:34 crc kubenswrapper[4909]: I1201 10:50:34.623387 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.159842243 podStartE2EDuration="7.623361396s" podCreationTimestamp="2025-12-01 10:50:27 +0000 UTC" firstStartedPulling="2025-12-01 10:50:28.532208628 +0000 UTC m=+1145.766679526" lastFinishedPulling="2025-12-01 10:50:32.995727781 +0000 UTC m=+1150.230198679" observedRunningTime="2025-12-01 10:50:34.620681712 +0000 UTC m=+1151.855152610" watchObservedRunningTime="2025-12-01 10:50:34.623361396 +0000 UTC m=+1151.857832294" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.523107 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.578656 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.578574 4909 generic.go:334] "Generic (PLEG): container finished" podID="d3a8241c-b729-4b4f-96b1-10cf76a9e548" containerID="deb0e17fdd02e88e97d2375072cd3352a0d3769e6145691bf1c774e7f4d65654" exitCode=0 Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.578743 4909 generic.go:334] "Generic (PLEG): container finished" podID="d3a8241c-b729-4b4f-96b1-10cf76a9e548" containerID="d101e4d05bff9a0da8411169694ad83f38723ed9ef4d0b042ce0052425c1d753" exitCode=143 Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.578709 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d3a8241c-b729-4b4f-96b1-10cf76a9e548","Type":"ContainerDied","Data":"deb0e17fdd02e88e97d2375072cd3352a0d3769e6145691bf1c774e7f4d65654"} Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.578810 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d3a8241c-b729-4b4f-96b1-10cf76a9e548","Type":"ContainerDied","Data":"d101e4d05bff9a0da8411169694ad83f38723ed9ef4d0b042ce0052425c1d753"} Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.578830 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d3a8241c-b729-4b4f-96b1-10cf76a9e548","Type":"ContainerDied","Data":"a1441d55d779a8ae89eebfa007737f859ffc64e778a96d37b61f6209e44f81a0"} Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.578849 4909 scope.go:117] "RemoveContainer" containerID="deb0e17fdd02e88e97d2375072cd3352a0d3769e6145691bf1c774e7f4d65654" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.617261 4909 scope.go:117] "RemoveContainer" containerID="d101e4d05bff9a0da8411169694ad83f38723ed9ef4d0b042ce0052425c1d753" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.643272 4909 scope.go:117] "RemoveContainer" 
containerID="deb0e17fdd02e88e97d2375072cd3352a0d3769e6145691bf1c774e7f4d65654" Dec 01 10:50:35 crc kubenswrapper[4909]: E1201 10:50:35.643801 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deb0e17fdd02e88e97d2375072cd3352a0d3769e6145691bf1c774e7f4d65654\": container with ID starting with deb0e17fdd02e88e97d2375072cd3352a0d3769e6145691bf1c774e7f4d65654 not found: ID does not exist" containerID="deb0e17fdd02e88e97d2375072cd3352a0d3769e6145691bf1c774e7f4d65654" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.643845 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deb0e17fdd02e88e97d2375072cd3352a0d3769e6145691bf1c774e7f4d65654"} err="failed to get container status \"deb0e17fdd02e88e97d2375072cd3352a0d3769e6145691bf1c774e7f4d65654\": rpc error: code = NotFound desc = could not find container \"deb0e17fdd02e88e97d2375072cd3352a0d3769e6145691bf1c774e7f4d65654\": container with ID starting with deb0e17fdd02e88e97d2375072cd3352a0d3769e6145691bf1c774e7f4d65654 not found: ID does not exist" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.643893 4909 scope.go:117] "RemoveContainer" containerID="d101e4d05bff9a0da8411169694ad83f38723ed9ef4d0b042ce0052425c1d753" Dec 01 10:50:35 crc kubenswrapper[4909]: E1201 10:50:35.644349 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d101e4d05bff9a0da8411169694ad83f38723ed9ef4d0b042ce0052425c1d753\": container with ID starting with d101e4d05bff9a0da8411169694ad83f38723ed9ef4d0b042ce0052425c1d753 not found: ID does not exist" containerID="d101e4d05bff9a0da8411169694ad83f38723ed9ef4d0b042ce0052425c1d753" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.644378 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d101e4d05bff9a0da8411169694ad83f38723ed9ef4d0b042ce0052425c1d753"} err="failed to get container status \"d101e4d05bff9a0da8411169694ad83f38723ed9ef4d0b042ce0052425c1d753\": rpc error: code = NotFound desc = could not find container \"d101e4d05bff9a0da8411169694ad83f38723ed9ef4d0b042ce0052425c1d753\": container with ID starting with d101e4d05bff9a0da8411169694ad83f38723ed9ef4d0b042ce0052425c1d753 not found: ID does not exist" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.644395 4909 scope.go:117] "RemoveContainer" containerID="deb0e17fdd02e88e97d2375072cd3352a0d3769e6145691bf1c774e7f4d65654" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.644767 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deb0e17fdd02e88e97d2375072cd3352a0d3769e6145691bf1c774e7f4d65654"} err="failed to get container status \"deb0e17fdd02e88e97d2375072cd3352a0d3769e6145691bf1c774e7f4d65654\": rpc error: code = NotFound desc = could not find container \"deb0e17fdd02e88e97d2375072cd3352a0d3769e6145691bf1c774e7f4d65654\": container with ID starting with deb0e17fdd02e88e97d2375072cd3352a0d3769e6145691bf1c774e7f4d65654 not found: ID does not exist" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.644792 4909 scope.go:117] "RemoveContainer" containerID="d101e4d05bff9a0da8411169694ad83f38723ed9ef4d0b042ce0052425c1d753" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.645050 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d101e4d05bff9a0da8411169694ad83f38723ed9ef4d0b042ce0052425c1d753"} err="failed to get container status \"d101e4d05bff9a0da8411169694ad83f38723ed9ef4d0b042ce0052425c1d753\": rpc error: code = NotFound desc = could not find container \"d101e4d05bff9a0da8411169694ad83f38723ed9ef4d0b042ce0052425c1d753\": container with ID starting with d101e4d05bff9a0da8411169694ad83f38723ed9ef4d0b042ce0052425c1d753 not found: ID does not 
exist" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.656152 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3a8241c-b729-4b4f-96b1-10cf76a9e548-logs\") pod \"d3a8241c-b729-4b4f-96b1-10cf76a9e548\" (UID: \"d3a8241c-b729-4b4f-96b1-10cf76a9e548\") " Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.656280 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a8241c-b729-4b4f-96b1-10cf76a9e548-combined-ca-bundle\") pod \"d3a8241c-b729-4b4f-96b1-10cf76a9e548\" (UID: \"d3a8241c-b729-4b4f-96b1-10cf76a9e548\") " Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.656474 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwz7s\" (UniqueName: \"kubernetes.io/projected/d3a8241c-b729-4b4f-96b1-10cf76a9e548-kube-api-access-jwz7s\") pod \"d3a8241c-b729-4b4f-96b1-10cf76a9e548\" (UID: \"d3a8241c-b729-4b4f-96b1-10cf76a9e548\") " Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.656521 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a8241c-b729-4b4f-96b1-10cf76a9e548-config-data\") pod \"d3a8241c-b729-4b4f-96b1-10cf76a9e548\" (UID: \"d3a8241c-b729-4b4f-96b1-10cf76a9e548\") " Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.656798 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3a8241c-b729-4b4f-96b1-10cf76a9e548-logs" (OuterVolumeSpecName: "logs") pod "d3a8241c-b729-4b4f-96b1-10cf76a9e548" (UID: "d3a8241c-b729-4b4f-96b1-10cf76a9e548"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.657057 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3a8241c-b729-4b4f-96b1-10cf76a9e548-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.663477 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3a8241c-b729-4b4f-96b1-10cf76a9e548-kube-api-access-jwz7s" (OuterVolumeSpecName: "kube-api-access-jwz7s") pod "d3a8241c-b729-4b4f-96b1-10cf76a9e548" (UID: "d3a8241c-b729-4b4f-96b1-10cf76a9e548"). InnerVolumeSpecName "kube-api-access-jwz7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.688559 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3a8241c-b729-4b4f-96b1-10cf76a9e548-config-data" (OuterVolumeSpecName: "config-data") pod "d3a8241c-b729-4b4f-96b1-10cf76a9e548" (UID: "d3a8241c-b729-4b4f-96b1-10cf76a9e548"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.692279 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3a8241c-b729-4b4f-96b1-10cf76a9e548-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3a8241c-b729-4b4f-96b1-10cf76a9e548" (UID: "d3a8241c-b729-4b4f-96b1-10cf76a9e548"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.758958 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a8241c-b729-4b4f-96b1-10cf76a9e548-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.759025 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwz7s\" (UniqueName: \"kubernetes.io/projected/d3a8241c-b729-4b4f-96b1-10cf76a9e548-kube-api-access-jwz7s\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.759040 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a8241c-b729-4b4f-96b1-10cf76a9e548-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.915662 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.925835 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.938379 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:50:35 crc kubenswrapper[4909]: E1201 10:50:35.938820 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3a8241c-b729-4b4f-96b1-10cf76a9e548" containerName="nova-metadata-metadata" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.938860 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3a8241c-b729-4b4f-96b1-10cf76a9e548" containerName="nova-metadata-metadata" Dec 01 10:50:35 crc kubenswrapper[4909]: E1201 10:50:35.938910 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3a8241c-b729-4b4f-96b1-10cf76a9e548" containerName="nova-metadata-log" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.938918 
4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3a8241c-b729-4b4f-96b1-10cf76a9e548" containerName="nova-metadata-log" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.939096 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3a8241c-b729-4b4f-96b1-10cf76a9e548" containerName="nova-metadata-metadata" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.939114 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3a8241c-b729-4b4f-96b1-10cf76a9e548" containerName="nova-metadata-log" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.940039 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.944020 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.944042 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 10:50:35 crc kubenswrapper[4909]: I1201 10:50:35.964126 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.068441 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dd6c772-cf8d-467f-a3c9-f14614060ac1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\") " pod="openstack/nova-metadata-0" Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.068704 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dd6c772-cf8d-467f-a3c9-f14614060ac1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\") " pod="openstack/nova-metadata-0" 
Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.068759 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dd6c772-cf8d-467f-a3c9-f14614060ac1-logs\") pod \"nova-metadata-0\" (UID: \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\") " pod="openstack/nova-metadata-0" Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.068781 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dd6c772-cf8d-467f-a3c9-f14614060ac1-config-data\") pod \"nova-metadata-0\" (UID: \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\") " pod="openstack/nova-metadata-0" Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.068800 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdv22\" (UniqueName: \"kubernetes.io/projected/9dd6c772-cf8d-467f-a3c9-f14614060ac1-kube-api-access-mdv22\") pod \"nova-metadata-0\" (UID: \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\") " pod="openstack/nova-metadata-0" Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.170520 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdv22\" (UniqueName: \"kubernetes.io/projected/9dd6c772-cf8d-467f-a3c9-f14614060ac1-kube-api-access-mdv22\") pod \"nova-metadata-0\" (UID: \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\") " pod="openstack/nova-metadata-0" Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.171005 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dd6c772-cf8d-467f-a3c9-f14614060ac1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\") " pod="openstack/nova-metadata-0" Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.171161 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dd6c772-cf8d-467f-a3c9-f14614060ac1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\") " pod="openstack/nova-metadata-0" Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.171317 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dd6c772-cf8d-467f-a3c9-f14614060ac1-logs\") pod \"nova-metadata-0\" (UID: \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\") " pod="openstack/nova-metadata-0" Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.171451 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dd6c772-cf8d-467f-a3c9-f14614060ac1-config-data\") pod \"nova-metadata-0\" (UID: \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\") " pod="openstack/nova-metadata-0" Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.171938 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dd6c772-cf8d-467f-a3c9-f14614060ac1-logs\") pod \"nova-metadata-0\" (UID: \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\") " pod="openstack/nova-metadata-0" Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.177707 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dd6c772-cf8d-467f-a3c9-f14614060ac1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\") " pod="openstack/nova-metadata-0" Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.178070 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dd6c772-cf8d-467f-a3c9-f14614060ac1-config-data\") pod \"nova-metadata-0\" (UID: \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\") " 
pod="openstack/nova-metadata-0" Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.179838 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dd6c772-cf8d-467f-a3c9-f14614060ac1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\") " pod="openstack/nova-metadata-0" Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.192760 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdv22\" (UniqueName: \"kubernetes.io/projected/9dd6c772-cf8d-467f-a3c9-f14614060ac1-kube-api-access-mdv22\") pod \"nova-metadata-0\" (UID: \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\") " pod="openstack/nova-metadata-0" Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.193308 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.193375 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.193426 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.194273 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8100ded86185432844121910234322762069105ef0bb9776e57888f1149baba1"} 
pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.194336 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" containerID="cri-o://8100ded86185432844121910234322762069105ef0bb9776e57888f1149baba1" gracePeriod=600 Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.267207 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.603437 4909 generic.go:334] "Generic (PLEG): container finished" podID="672850e4-d044-44cc-b8a2-517dc1a285be" containerID="8100ded86185432844121910234322762069105ef0bb9776e57888f1149baba1" exitCode=0 Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.603496 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerDied","Data":"8100ded86185432844121910234322762069105ef0bb9776e57888f1149baba1"} Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.604239 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"6aa807195832fa8b3d5986bee6241afe4be3be05e68f4c945ec2b1d547d17a95"} Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.604278 4909 scope.go:117] "RemoveContainer" containerID="421cf8f5c478fd334e97e45775a9bedfa323e6a4c50a049b81ebf8da31dc53c8" Dec 01 10:50:36 crc kubenswrapper[4909]: W1201 10:50:36.776002 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dd6c772_cf8d_467f_a3c9_f14614060ac1.slice/crio-6207cc34cc75750a2a0cbe1b80c2c073d28a67eb33c292e3e525d9bee27495bb WatchSource:0}: Error finding container 6207cc34cc75750a2a0cbe1b80c2c073d28a67eb33c292e3e525d9bee27495bb: Status 404 returned error can't find the container with id 6207cc34cc75750a2a0cbe1b80c2c073d28a67eb33c292e3e525d9bee27495bb Dec 01 10:50:36 crc kubenswrapper[4909]: I1201 10:50:36.778340 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:50:37 crc kubenswrapper[4909]: I1201 10:50:37.269919 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3a8241c-b729-4b4f-96b1-10cf76a9e548" path="/var/lib/kubelet/pods/d3a8241c-b729-4b4f-96b1-10cf76a9e548/volumes" Dec 01 10:50:37 crc kubenswrapper[4909]: I1201 10:50:37.626747 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9dd6c772-cf8d-467f-a3c9-f14614060ac1","Type":"ContainerStarted","Data":"ee0e246e2aa5f7681950d03ef0bc3765d8607a071ec4f2f64e996a6092815529"} Dec 01 10:50:37 crc kubenswrapper[4909]: I1201 10:50:37.627278 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9dd6c772-cf8d-467f-a3c9-f14614060ac1","Type":"ContainerStarted","Data":"2f30d210612764c2f454c0c7a932a4406a8b86e4f16433529990a56ac7d87309"} Dec 01 10:50:37 crc kubenswrapper[4909]: I1201 10:50:37.627299 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9dd6c772-cf8d-467f-a3c9-f14614060ac1","Type":"ContainerStarted","Data":"6207cc34cc75750a2a0cbe1b80c2c073d28a67eb33c292e3e525d9bee27495bb"} Dec 01 10:50:37 crc kubenswrapper[4909]: I1201 10:50:37.630022 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 10:50:37 crc kubenswrapper[4909]: I1201 10:50:37.630073 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 10:50:37 crc kubenswrapper[4909]: I1201 10:50:37.655411 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.655381307 podStartE2EDuration="2.655381307s" podCreationTimestamp="2025-12-01 10:50:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:50:37.649232085 +0000 UTC m=+1154.883703003" watchObservedRunningTime="2025-12-01 10:50:37.655381307 +0000 UTC m=+1154.889852225" Dec 01 10:50:37 crc kubenswrapper[4909]: I1201 10:50:37.960360 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:50:38 crc kubenswrapper[4909]: I1201 10:50:38.158522 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 10:50:38 crc kubenswrapper[4909]: I1201 10:50:38.158989 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 10:50:38 crc kubenswrapper[4909]: I1201 10:50:38.211535 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 10:50:38 crc kubenswrapper[4909]: I1201 10:50:38.390154 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" Dec 01 10:50:38 crc kubenswrapper[4909]: I1201 10:50:38.465993 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-wsc9p"] Dec 01 10:50:38 crc kubenswrapper[4909]: I1201 10:50:38.466766 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" podUID="06f87308-e13d-4bc3-89ed-c0ec275c4824" containerName="dnsmasq-dns" containerID="cri-o://aad1d773964d01a8539bc400576046879713b710f8f394e8c490217d5f3eb00d" 
gracePeriod=10 Dec 01 10:50:38 crc kubenswrapper[4909]: I1201 10:50:38.666823 4909 generic.go:334] "Generic (PLEG): container finished" podID="4556e9cd-36ab-411b-9065-0b74a4c426a5" containerID="cf93c045bda5c27f82b5e9a4fa58e99c37c3561a467b54813269308f17bfdf9c" exitCode=0 Dec 01 10:50:38 crc kubenswrapper[4909]: I1201 10:50:38.667007 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9whwl" event={"ID":"4556e9cd-36ab-411b-9065-0b74a4c426a5","Type":"ContainerDied","Data":"cf93c045bda5c27f82b5e9a4fa58e99c37c3561a467b54813269308f17bfdf9c"} Dec 01 10:50:38 crc kubenswrapper[4909]: I1201 10:50:38.680273 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9a43e441-37c5-4ada-9bb8-f610dd18a5f9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.168:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 10:50:38 crc kubenswrapper[4909]: I1201 10:50:38.681232 4909 generic.go:334] "Generic (PLEG): container finished" podID="8ddd315c-8e96-40fd-ba04-48bf855ac533" containerID="3ada28d3082887f0c23f4cabfbfbb741cea5332095b300be69dcd94609ddd8fc" exitCode=0 Dec 01 10:50:38 crc kubenswrapper[4909]: I1201 10:50:38.681342 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bqghh" event={"ID":"8ddd315c-8e96-40fd-ba04-48bf855ac533","Type":"ContainerDied","Data":"3ada28d3082887f0c23f4cabfbfbb741cea5332095b300be69dcd94609ddd8fc"} Dec 01 10:50:38 crc kubenswrapper[4909]: I1201 10:50:38.690813 4909 generic.go:334] "Generic (PLEG): container finished" podID="06f87308-e13d-4bc3-89ed-c0ec275c4824" containerID="aad1d773964d01a8539bc400576046879713b710f8f394e8c490217d5f3eb00d" exitCode=0 Dec 01 10:50:38 crc kubenswrapper[4909]: I1201 10:50:38.692976 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" 
event={"ID":"06f87308-e13d-4bc3-89ed-c0ec275c4824","Type":"ContainerDied","Data":"aad1d773964d01a8539bc400576046879713b710f8f394e8c490217d5f3eb00d"} Dec 01 10:50:38 crc kubenswrapper[4909]: I1201 10:50:38.722604 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9a43e441-37c5-4ada-9bb8-f610dd18a5f9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.168:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 10:50:38 crc kubenswrapper[4909]: I1201 10:50:38.759113 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.246938 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.372115 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-dns-svc\") pod \"06f87308-e13d-4bc3-89ed-c0ec275c4824\" (UID: \"06f87308-e13d-4bc3-89ed-c0ec275c4824\") " Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.372297 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-ovsdbserver-nb\") pod \"06f87308-e13d-4bc3-89ed-c0ec275c4824\" (UID: \"06f87308-e13d-4bc3-89ed-c0ec275c4824\") " Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.372331 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-config\") pod \"06f87308-e13d-4bc3-89ed-c0ec275c4824\" (UID: \"06f87308-e13d-4bc3-89ed-c0ec275c4824\") " Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.372392 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-ovsdbserver-sb\") pod \"06f87308-e13d-4bc3-89ed-c0ec275c4824\" (UID: \"06f87308-e13d-4bc3-89ed-c0ec275c4824\") " Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.372467 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2knfw\" (UniqueName: \"kubernetes.io/projected/06f87308-e13d-4bc3-89ed-c0ec275c4824-kube-api-access-2knfw\") pod \"06f87308-e13d-4bc3-89ed-c0ec275c4824\" (UID: \"06f87308-e13d-4bc3-89ed-c0ec275c4824\") " Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.380158 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06f87308-e13d-4bc3-89ed-c0ec275c4824-kube-api-access-2knfw" (OuterVolumeSpecName: "kube-api-access-2knfw") pod "06f87308-e13d-4bc3-89ed-c0ec275c4824" (UID: "06f87308-e13d-4bc3-89ed-c0ec275c4824"). InnerVolumeSpecName "kube-api-access-2knfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.436964 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-config" (OuterVolumeSpecName: "config") pod "06f87308-e13d-4bc3-89ed-c0ec275c4824" (UID: "06f87308-e13d-4bc3-89ed-c0ec275c4824"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.456596 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "06f87308-e13d-4bc3-89ed-c0ec275c4824" (UID: "06f87308-e13d-4bc3-89ed-c0ec275c4824"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.459442 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "06f87308-e13d-4bc3-89ed-c0ec275c4824" (UID: "06f87308-e13d-4bc3-89ed-c0ec275c4824"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.476673 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.476721 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.476732 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.476745 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2knfw\" (UniqueName: \"kubernetes.io/projected/06f87308-e13d-4bc3-89ed-c0ec275c4824-kube-api-access-2knfw\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.479153 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "06f87308-e13d-4bc3-89ed-c0ec275c4824" (UID: "06f87308-e13d-4bc3-89ed-c0ec275c4824"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.579101 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06f87308-e13d-4bc3-89ed-c0ec275c4824-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.703997 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" event={"ID":"06f87308-e13d-4bc3-89ed-c0ec275c4824","Type":"ContainerDied","Data":"86ef8eedcbeaf05b9bdb60e6285528f6ce6275052823f8e7d15ed5a9952a9cec"} Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.704025 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-wsc9p" Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.706096 4909 scope.go:117] "RemoveContainer" containerID="aad1d773964d01a8539bc400576046879713b710f8f394e8c490217d5f3eb00d" Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.756591 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-wsc9p"] Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.765004 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-wsc9p"] Dec 01 10:50:39 crc kubenswrapper[4909]: I1201 10:50:39.766628 4909 scope.go:117] "RemoveContainer" containerID="9f1edb7681b02a45797ae6659b76d203e94036dd3a98c266bbf849e792ab68d8" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.213852 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bqghh" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.222091 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9whwl" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.297125 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ddd315c-8e96-40fd-ba04-48bf855ac533-scripts\") pod \"8ddd315c-8e96-40fd-ba04-48bf855ac533\" (UID: \"8ddd315c-8e96-40fd-ba04-48bf855ac533\") " Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.297253 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4556e9cd-36ab-411b-9065-0b74a4c426a5-combined-ca-bundle\") pod \"4556e9cd-36ab-411b-9065-0b74a4c426a5\" (UID: \"4556e9cd-36ab-411b-9065-0b74a4c426a5\") " Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.297305 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6clcd\" (UniqueName: \"kubernetes.io/projected/8ddd315c-8e96-40fd-ba04-48bf855ac533-kube-api-access-6clcd\") pod \"8ddd315c-8e96-40fd-ba04-48bf855ac533\" (UID: \"8ddd315c-8e96-40fd-ba04-48bf855ac533\") " Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.297383 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4556e9cd-36ab-411b-9065-0b74a4c426a5-config-data\") pod \"4556e9cd-36ab-411b-9065-0b74a4c426a5\" (UID: \"4556e9cd-36ab-411b-9065-0b74a4c426a5\") " Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.305297 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4556e9cd-36ab-411b-9065-0b74a4c426a5-scripts\") pod \"4556e9cd-36ab-411b-9065-0b74a4c426a5\" (UID: \"4556e9cd-36ab-411b-9065-0b74a4c426a5\") " Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.305391 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv2pm\" 
(UniqueName: \"kubernetes.io/projected/4556e9cd-36ab-411b-9065-0b74a4c426a5-kube-api-access-vv2pm\") pod \"4556e9cd-36ab-411b-9065-0b74a4c426a5\" (UID: \"4556e9cd-36ab-411b-9065-0b74a4c426a5\") " Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.305434 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ddd315c-8e96-40fd-ba04-48bf855ac533-config-data\") pod \"8ddd315c-8e96-40fd-ba04-48bf855ac533\" (UID: \"8ddd315c-8e96-40fd-ba04-48bf855ac533\") " Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.305491 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ddd315c-8e96-40fd-ba04-48bf855ac533-combined-ca-bundle\") pod \"8ddd315c-8e96-40fd-ba04-48bf855ac533\" (UID: \"8ddd315c-8e96-40fd-ba04-48bf855ac533\") " Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.306753 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ddd315c-8e96-40fd-ba04-48bf855ac533-kube-api-access-6clcd" (OuterVolumeSpecName: "kube-api-access-6clcd") pod "8ddd315c-8e96-40fd-ba04-48bf855ac533" (UID: "8ddd315c-8e96-40fd-ba04-48bf855ac533"). InnerVolumeSpecName "kube-api-access-6clcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.307304 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ddd315c-8e96-40fd-ba04-48bf855ac533-scripts" (OuterVolumeSpecName: "scripts") pod "8ddd315c-8e96-40fd-ba04-48bf855ac533" (UID: "8ddd315c-8e96-40fd-ba04-48bf855ac533"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.309448 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4556e9cd-36ab-411b-9065-0b74a4c426a5-kube-api-access-vv2pm" (OuterVolumeSpecName: "kube-api-access-vv2pm") pod "4556e9cd-36ab-411b-9065-0b74a4c426a5" (UID: "4556e9cd-36ab-411b-9065-0b74a4c426a5"). InnerVolumeSpecName "kube-api-access-vv2pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.311565 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4556e9cd-36ab-411b-9065-0b74a4c426a5-scripts" (OuterVolumeSpecName: "scripts") pod "4556e9cd-36ab-411b-9065-0b74a4c426a5" (UID: "4556e9cd-36ab-411b-9065-0b74a4c426a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.330659 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4556e9cd-36ab-411b-9065-0b74a4c426a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4556e9cd-36ab-411b-9065-0b74a4c426a5" (UID: "4556e9cd-36ab-411b-9065-0b74a4c426a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.342686 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4556e9cd-36ab-411b-9065-0b74a4c426a5-config-data" (OuterVolumeSpecName: "config-data") pod "4556e9cd-36ab-411b-9065-0b74a4c426a5" (UID: "4556e9cd-36ab-411b-9065-0b74a4c426a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.343256 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ddd315c-8e96-40fd-ba04-48bf855ac533-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ddd315c-8e96-40fd-ba04-48bf855ac533" (UID: "8ddd315c-8e96-40fd-ba04-48bf855ac533"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.346056 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ddd315c-8e96-40fd-ba04-48bf855ac533-config-data" (OuterVolumeSpecName: "config-data") pod "8ddd315c-8e96-40fd-ba04-48bf855ac533" (UID: "8ddd315c-8e96-40fd-ba04-48bf855ac533"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.409162 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ddd315c-8e96-40fd-ba04-48bf855ac533-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.409203 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4556e9cd-36ab-411b-9065-0b74a4c426a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.409216 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6clcd\" (UniqueName: \"kubernetes.io/projected/8ddd315c-8e96-40fd-ba04-48bf855ac533-kube-api-access-6clcd\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.409229 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4556e9cd-36ab-411b-9065-0b74a4c426a5-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 
10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.409237 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4556e9cd-36ab-411b-9065-0b74a4c426a5-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.409246 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv2pm\" (UniqueName: \"kubernetes.io/projected/4556e9cd-36ab-411b-9065-0b74a4c426a5-kube-api-access-vv2pm\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.409255 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ddd315c-8e96-40fd-ba04-48bf855ac533-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.409264 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ddd315c-8e96-40fd-ba04-48bf855ac533-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.718829 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9whwl" event={"ID":"4556e9cd-36ab-411b-9065-0b74a4c426a5","Type":"ContainerDied","Data":"9d26e1bac7fc2b0f5f6ef09aaf2f483cc5d2b55ea657bf6c5639fa70f570a472"} Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.718895 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d26e1bac7fc2b0f5f6ef09aaf2f483cc5d2b55ea657bf6c5639fa70f570a472" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.718959 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9whwl" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.725710 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bqghh" event={"ID":"8ddd315c-8e96-40fd-ba04-48bf855ac533","Type":"ContainerDied","Data":"a09de2c85832dd6f166ecc76dcd32e8dc377c319e073a89a049f760f390248c6"} Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.725767 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a09de2c85832dd6f166ecc76dcd32e8dc377c319e073a89a049f760f390248c6" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.725893 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bqghh" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.814939 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 10:50:40 crc kubenswrapper[4909]: E1201 10:50:40.816299 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4556e9cd-36ab-411b-9065-0b74a4c426a5" containerName="nova-manage" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.816326 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4556e9cd-36ab-411b-9065-0b74a4c426a5" containerName="nova-manage" Dec 01 10:50:40 crc kubenswrapper[4909]: E1201 10:50:40.816342 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f87308-e13d-4bc3-89ed-c0ec275c4824" containerName="init" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.816348 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f87308-e13d-4bc3-89ed-c0ec275c4824" containerName="init" Dec 01 10:50:40 crc kubenswrapper[4909]: E1201 10:50:40.816576 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ddd315c-8e96-40fd-ba04-48bf855ac533" containerName="nova-cell1-conductor-db-sync" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 
10:50:40.816586 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ddd315c-8e96-40fd-ba04-48bf855ac533" containerName="nova-cell1-conductor-db-sync" Dec 01 10:50:40 crc kubenswrapper[4909]: E1201 10:50:40.816928 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f87308-e13d-4bc3-89ed-c0ec275c4824" containerName="dnsmasq-dns" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.816942 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f87308-e13d-4bc3-89ed-c0ec275c4824" containerName="dnsmasq-dns" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.817162 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="06f87308-e13d-4bc3-89ed-c0ec275c4824" containerName="dnsmasq-dns" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.817200 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ddd315c-8e96-40fd-ba04-48bf855ac533" containerName="nova-cell1-conductor-db-sync" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.817216 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4556e9cd-36ab-411b-9065-0b74a4c426a5" containerName="nova-manage" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.819975 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.823206 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.836767 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.917607 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccwvr\" (UniqueName: \"kubernetes.io/projected/825ab14c-14ed-4ca8-a367-15ed10bf1bc9-kube-api-access-ccwvr\") pod \"nova-cell1-conductor-0\" (UID: \"825ab14c-14ed-4ca8-a367-15ed10bf1bc9\") " pod="openstack/nova-cell1-conductor-0" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.917700 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825ab14c-14ed-4ca8-a367-15ed10bf1bc9-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"825ab14c-14ed-4ca8-a367-15ed10bf1bc9\") " pod="openstack/nova-cell1-conductor-0" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.917732 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825ab14c-14ed-4ca8-a367-15ed10bf1bc9-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"825ab14c-14ed-4ca8-a367-15ed10bf1bc9\") " pod="openstack/nova-cell1-conductor-0" Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.970964 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.971209 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f9bcf0a4-b949-4dc4-8846-7bcefd6961bc" 
containerName="nova-scheduler-scheduler" containerID="cri-o://79d74f4d034728278ddf1bbcb348d7412f4b09e11b856263ef977121dc2bc398" gracePeriod=30 Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.987932 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.988350 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9a43e441-37c5-4ada-9bb8-f610dd18a5f9" containerName="nova-api-log" containerID="cri-o://128f7a7b48e854c2eb1d60c78f8b94137e95d0688068affb37014abeb6b28f58" gracePeriod=30 Dec 01 10:50:40 crc kubenswrapper[4909]: I1201 10:50:40.988776 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9a43e441-37c5-4ada-9bb8-f610dd18a5f9" containerName="nova-api-api" containerID="cri-o://d74a13493d137e296ab28e41b51c2b340251a8e9df2bc934a7e4ec011b78a7e3" gracePeriod=30 Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.004353 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.004687 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9dd6c772-cf8d-467f-a3c9-f14614060ac1" containerName="nova-metadata-log" containerID="cri-o://2f30d210612764c2f454c0c7a932a4406a8b86e4f16433529990a56ac7d87309" gracePeriod=30 Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.005146 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9dd6c772-cf8d-467f-a3c9-f14614060ac1" containerName="nova-metadata-metadata" containerID="cri-o://ee0e246e2aa5f7681950d03ef0bc3765d8607a071ec4f2f64e996a6092815529" gracePeriod=30 Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.021264 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/825ab14c-14ed-4ca8-a367-15ed10bf1bc9-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"825ab14c-14ed-4ca8-a367-15ed10bf1bc9\") " pod="openstack/nova-cell1-conductor-0" Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.021339 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825ab14c-14ed-4ca8-a367-15ed10bf1bc9-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"825ab14c-14ed-4ca8-a367-15ed10bf1bc9\") " pod="openstack/nova-cell1-conductor-0" Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.021454 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccwvr\" (UniqueName: \"kubernetes.io/projected/825ab14c-14ed-4ca8-a367-15ed10bf1bc9-kube-api-access-ccwvr\") pod \"nova-cell1-conductor-0\" (UID: \"825ab14c-14ed-4ca8-a367-15ed10bf1bc9\") " pod="openstack/nova-cell1-conductor-0" Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.026887 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825ab14c-14ed-4ca8-a367-15ed10bf1bc9-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"825ab14c-14ed-4ca8-a367-15ed10bf1bc9\") " pod="openstack/nova-cell1-conductor-0" Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.027629 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825ab14c-14ed-4ca8-a367-15ed10bf1bc9-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"825ab14c-14ed-4ca8-a367-15ed10bf1bc9\") " pod="openstack/nova-cell1-conductor-0" Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.052744 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccwvr\" (UniqueName: \"kubernetes.io/projected/825ab14c-14ed-4ca8-a367-15ed10bf1bc9-kube-api-access-ccwvr\") pod 
\"nova-cell1-conductor-0\" (UID: \"825ab14c-14ed-4ca8-a367-15ed10bf1bc9\") " pod="openstack/nova-cell1-conductor-0" Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.159392 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.294971 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06f87308-e13d-4bc3-89ed-c0ec275c4824" path="/var/lib/kubelet/pods/06f87308-e13d-4bc3-89ed-c0ec275c4824/volumes" Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.296315 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.296352 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.757400 4909 generic.go:334] "Generic (PLEG): container finished" podID="9dd6c772-cf8d-467f-a3c9-f14614060ac1" containerID="ee0e246e2aa5f7681950d03ef0bc3765d8607a071ec4f2f64e996a6092815529" exitCode=0 Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.757461 4909 generic.go:334] "Generic (PLEG): container finished" podID="9dd6c772-cf8d-467f-a3c9-f14614060ac1" containerID="2f30d210612764c2f454c0c7a932a4406a8b86e4f16433529990a56ac7d87309" exitCode=143 Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.757448 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9dd6c772-cf8d-467f-a3c9-f14614060ac1","Type":"ContainerDied","Data":"ee0e246e2aa5f7681950d03ef0bc3765d8607a071ec4f2f64e996a6092815529"} Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.757536 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9dd6c772-cf8d-467f-a3c9-f14614060ac1","Type":"ContainerDied","Data":"2f30d210612764c2f454c0c7a932a4406a8b86e4f16433529990a56ac7d87309"} Dec 01 10:50:41 
crc kubenswrapper[4909]: I1201 10:50:41.757589 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9dd6c772-cf8d-467f-a3c9-f14614060ac1","Type":"ContainerDied","Data":"6207cc34cc75750a2a0cbe1b80c2c073d28a67eb33c292e3e525d9bee27495bb"} Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.757603 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6207cc34cc75750a2a0cbe1b80c2c073d28a67eb33c292e3e525d9bee27495bb" Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.770899 4909 generic.go:334] "Generic (PLEG): container finished" podID="9a43e441-37c5-4ada-9bb8-f610dd18a5f9" containerID="128f7a7b48e854c2eb1d60c78f8b94137e95d0688068affb37014abeb6b28f58" exitCode=143 Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.770947 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9a43e441-37c5-4ada-9bb8-f610dd18a5f9","Type":"ContainerDied","Data":"128f7a7b48e854c2eb1d60c78f8b94137e95d0688068affb37014abeb6b28f58"} Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.774909 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.945330 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dd6c772-cf8d-467f-a3c9-f14614060ac1-config-data\") pod \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\" (UID: \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\") " Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.945490 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdv22\" (UniqueName: \"kubernetes.io/projected/9dd6c772-cf8d-467f-a3c9-f14614060ac1-kube-api-access-mdv22\") pod \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\" (UID: \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\") " Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.945657 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dd6c772-cf8d-467f-a3c9-f14614060ac1-nova-metadata-tls-certs\") pod \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\" (UID: \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\") " Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.945736 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dd6c772-cf8d-467f-a3c9-f14614060ac1-combined-ca-bundle\") pod \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\" (UID: \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\") " Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.945766 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dd6c772-cf8d-467f-a3c9-f14614060ac1-logs\") pod \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\" (UID: \"9dd6c772-cf8d-467f-a3c9-f14614060ac1\") " Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.947207 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9dd6c772-cf8d-467f-a3c9-f14614060ac1-logs" (OuterVolumeSpecName: "logs") pod "9dd6c772-cf8d-467f-a3c9-f14614060ac1" (UID: "9dd6c772-cf8d-467f-a3c9-f14614060ac1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.977383 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dd6c772-cf8d-467f-a3c9-f14614060ac1-kube-api-access-mdv22" (OuterVolumeSpecName: "kube-api-access-mdv22") pod "9dd6c772-cf8d-467f-a3c9-f14614060ac1" (UID: "9dd6c772-cf8d-467f-a3c9-f14614060ac1"). InnerVolumeSpecName "kube-api-access-mdv22". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:50:41 crc kubenswrapper[4909]: I1201 10:50:41.997077 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.002605 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dd6c772-cf8d-467f-a3c9-f14614060ac1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9dd6c772-cf8d-467f-a3c9-f14614060ac1" (UID: "9dd6c772-cf8d-467f-a3c9-f14614060ac1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.004029 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dd6c772-cf8d-467f-a3c9-f14614060ac1-config-data" (OuterVolumeSpecName: "config-data") pod "9dd6c772-cf8d-467f-a3c9-f14614060ac1" (UID: "9dd6c772-cf8d-467f-a3c9-f14614060ac1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:42 crc kubenswrapper[4909]: W1201 10:50:42.006614 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod825ab14c_14ed_4ca8_a367_15ed10bf1bc9.slice/crio-bb926aab163c836ecf215a07f4073221e643256a777ea1183478bc24a9e7450f WatchSource:0}: Error finding container bb926aab163c836ecf215a07f4073221e643256a777ea1183478bc24a9e7450f: Status 404 returned error can't find the container with id bb926aab163c836ecf215a07f4073221e643256a777ea1183478bc24a9e7450f Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.050668 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dd6c772-cf8d-467f-a3c9-f14614060ac1-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.050713 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdv22\" (UniqueName: \"kubernetes.io/projected/9dd6c772-cf8d-467f-a3c9-f14614060ac1-kube-api-access-mdv22\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.050731 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dd6c772-cf8d-467f-a3c9-f14614060ac1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.050746 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dd6c772-cf8d-467f-a3c9-f14614060ac1-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.054942 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dd6c772-cf8d-467f-a3c9-f14614060ac1-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "9dd6c772-cf8d-467f-a3c9-f14614060ac1" (UID: 
"9dd6c772-cf8d-467f-a3c9-f14614060ac1"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.153765 4909 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dd6c772-cf8d-467f-a3c9-f14614060ac1-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.786759 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"825ab14c-14ed-4ca8-a367-15ed10bf1bc9","Type":"ContainerStarted","Data":"c7ed1a3bb06b9cd04cb0767302a11d1a603fb1b72107926f5b34c021bdc8ff25"} Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.787478 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"825ab14c-14ed-4ca8-a367-15ed10bf1bc9","Type":"ContainerStarted","Data":"bb926aab163c836ecf215a07f4073221e643256a777ea1183478bc24a9e7450f"} Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.786792 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.803342 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.832255 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.83223716 podStartE2EDuration="2.83223716s" podCreationTimestamp="2025-12-01 10:50:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:50:42.825342766 +0000 UTC m=+1160.059813684" watchObservedRunningTime="2025-12-01 10:50:42.83223716 +0000 UTC m=+1160.066708048" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.866313 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.877260 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.887884 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:50:42 crc kubenswrapper[4909]: E1201 10:50:42.888499 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd6c772-cf8d-467f-a3c9-f14614060ac1" containerName="nova-metadata-metadata" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.888526 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd6c772-cf8d-467f-a3c9-f14614060ac1" containerName="nova-metadata-metadata" Dec 01 10:50:42 crc kubenswrapper[4909]: E1201 10:50:42.888542 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd6c772-cf8d-467f-a3c9-f14614060ac1" containerName="nova-metadata-log" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.888550 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9dd6c772-cf8d-467f-a3c9-f14614060ac1" containerName="nova-metadata-log" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.888847 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dd6c772-cf8d-467f-a3c9-f14614060ac1" containerName="nova-metadata-metadata" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.888884 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dd6c772-cf8d-467f-a3c9-f14614060ac1" containerName="nova-metadata-log" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.890350 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.893383 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.895686 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.896920 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.976048 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q7fx\" (UniqueName: \"kubernetes.io/projected/16639ce4-cbc0-437d-b43c-63f9800bb171-kube-api-access-5q7fx\") pod \"nova-metadata-0\" (UID: \"16639ce4-cbc0-437d-b43c-63f9800bb171\") " pod="openstack/nova-metadata-0" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.976137 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16639ce4-cbc0-437d-b43c-63f9800bb171-config-data\") pod \"nova-metadata-0\" (UID: \"16639ce4-cbc0-437d-b43c-63f9800bb171\") " pod="openstack/nova-metadata-0" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.976172 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/16639ce4-cbc0-437d-b43c-63f9800bb171-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"16639ce4-cbc0-437d-b43c-63f9800bb171\") " pod="openstack/nova-metadata-0" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.976644 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16639ce4-cbc0-437d-b43c-63f9800bb171-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"16639ce4-cbc0-437d-b43c-63f9800bb171\") " pod="openstack/nova-metadata-0" Dec 01 10:50:42 crc kubenswrapper[4909]: I1201 10:50:42.976828 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16639ce4-cbc0-437d-b43c-63f9800bb171-logs\") pod \"nova-metadata-0\" (UID: \"16639ce4-cbc0-437d-b43c-63f9800bb171\") " pod="openstack/nova-metadata-0" Dec 01 10:50:43 crc kubenswrapper[4909]: I1201 10:50:43.079716 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16639ce4-cbc0-437d-b43c-63f9800bb171-config-data\") pod \"nova-metadata-0\" (UID: \"16639ce4-cbc0-437d-b43c-63f9800bb171\") " pod="openstack/nova-metadata-0" Dec 01 10:50:43 crc kubenswrapper[4909]: I1201 10:50:43.079792 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/16639ce4-cbc0-437d-b43c-63f9800bb171-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"16639ce4-cbc0-437d-b43c-63f9800bb171\") " pod="openstack/nova-metadata-0" Dec 01 10:50:43 crc kubenswrapper[4909]: I1201 10:50:43.079865 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/16639ce4-cbc0-437d-b43c-63f9800bb171-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"16639ce4-cbc0-437d-b43c-63f9800bb171\") " pod="openstack/nova-metadata-0" Dec 01 10:50:43 crc kubenswrapper[4909]: I1201 10:50:43.079941 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16639ce4-cbc0-437d-b43c-63f9800bb171-logs\") pod \"nova-metadata-0\" (UID: \"16639ce4-cbc0-437d-b43c-63f9800bb171\") " pod="openstack/nova-metadata-0" Dec 01 10:50:43 crc kubenswrapper[4909]: I1201 10:50:43.080026 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q7fx\" (UniqueName: \"kubernetes.io/projected/16639ce4-cbc0-437d-b43c-63f9800bb171-kube-api-access-5q7fx\") pod \"nova-metadata-0\" (UID: \"16639ce4-cbc0-437d-b43c-63f9800bb171\") " pod="openstack/nova-metadata-0" Dec 01 10:50:43 crc kubenswrapper[4909]: I1201 10:50:43.080569 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16639ce4-cbc0-437d-b43c-63f9800bb171-logs\") pod \"nova-metadata-0\" (UID: \"16639ce4-cbc0-437d-b43c-63f9800bb171\") " pod="openstack/nova-metadata-0" Dec 01 10:50:43 crc kubenswrapper[4909]: I1201 10:50:43.085961 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16639ce4-cbc0-437d-b43c-63f9800bb171-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"16639ce4-cbc0-437d-b43c-63f9800bb171\") " pod="openstack/nova-metadata-0" Dec 01 10:50:43 crc kubenswrapper[4909]: I1201 10:50:43.090600 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16639ce4-cbc0-437d-b43c-63f9800bb171-config-data\") pod \"nova-metadata-0\" (UID: \"16639ce4-cbc0-437d-b43c-63f9800bb171\") " pod="openstack/nova-metadata-0" Dec 01 10:50:43 crc kubenswrapper[4909]: I1201 
10:50:43.091335 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/16639ce4-cbc0-437d-b43c-63f9800bb171-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"16639ce4-cbc0-437d-b43c-63f9800bb171\") " pod="openstack/nova-metadata-0" Dec 01 10:50:43 crc kubenswrapper[4909]: I1201 10:50:43.099754 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q7fx\" (UniqueName: \"kubernetes.io/projected/16639ce4-cbc0-437d-b43c-63f9800bb171-kube-api-access-5q7fx\") pod \"nova-metadata-0\" (UID: \"16639ce4-cbc0-437d-b43c-63f9800bb171\") " pod="openstack/nova-metadata-0" Dec 01 10:50:43 crc kubenswrapper[4909]: E1201 10:50:43.162926 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="79d74f4d034728278ddf1bbcb348d7412f4b09e11b856263ef977121dc2bc398" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 10:50:43 crc kubenswrapper[4909]: E1201 10:50:43.165027 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="79d74f4d034728278ddf1bbcb348d7412f4b09e11b856263ef977121dc2bc398" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 10:50:43 crc kubenswrapper[4909]: E1201 10:50:43.174120 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="79d74f4d034728278ddf1bbcb348d7412f4b09e11b856263ef977121dc2bc398" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 10:50:43 crc kubenswrapper[4909]: E1201 10:50:43.174181 4909 prober.go:104] "Probe errored" err="rpc error: code = Unknown 
desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f9bcf0a4-b949-4dc4-8846-7bcefd6961bc" containerName="nova-scheduler-scheduler" Dec 01 10:50:43 crc kubenswrapper[4909]: I1201 10:50:43.219018 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:50:43 crc kubenswrapper[4909]: I1201 10:50:43.271074 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dd6c772-cf8d-467f-a3c9-f14614060ac1" path="/var/lib/kubelet/pods/9dd6c772-cf8d-467f-a3c9-f14614060ac1/volumes" Dec 01 10:50:43 crc kubenswrapper[4909]: I1201 10:50:43.700980 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:50:43 crc kubenswrapper[4909]: I1201 10:50:43.801170 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16639ce4-cbc0-437d-b43c-63f9800bb171","Type":"ContainerStarted","Data":"92129a45fe76a976d0ab38f668df96709b14bff5d58f769f8314f0c5bd632f8a"} Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.634717 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.811500 4909 generic.go:334] "Generic (PLEG): container finished" podID="9a43e441-37c5-4ada-9bb8-f610dd18a5f9" containerID="d74a13493d137e296ab28e41b51c2b340251a8e9df2bc934a7e4ec011b78a7e3" exitCode=0 Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.811607 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9a43e441-37c5-4ada-9bb8-f610dd18a5f9","Type":"ContainerDied","Data":"d74a13493d137e296ab28e41b51c2b340251a8e9df2bc934a7e4ec011b78a7e3"} Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.811614 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.811650 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9a43e441-37c5-4ada-9bb8-f610dd18a5f9","Type":"ContainerDied","Data":"2007e53a3e3d63407478245d538c7ac48e3e57d5de43bd788e95fb0327f1ac8f"} Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.811674 4909 scope.go:117] "RemoveContainer" containerID="d74a13493d137e296ab28e41b51c2b340251a8e9df2bc934a7e4ec011b78a7e3" Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.814365 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16639ce4-cbc0-437d-b43c-63f9800bb171","Type":"ContainerStarted","Data":"aea4b0ba4bb88bf12308b28e3fea7a9e97026acf8921b55c9eea95799cd80f43"} Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.814394 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16639ce4-cbc0-437d-b43c-63f9800bb171","Type":"ContainerStarted","Data":"78337096400aa6a0bd1e985637ac0ed92032f9bacbb592090f0bb1a4de893fd9"} Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.820235 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-logs\") pod \"9a43e441-37c5-4ada-9bb8-f610dd18a5f9\" (UID: \"9a43e441-37c5-4ada-9bb8-f610dd18a5f9\") " Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.820365 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jpzz\" (UniqueName: \"kubernetes.io/projected/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-kube-api-access-5jpzz\") pod \"9a43e441-37c5-4ada-9bb8-f610dd18a5f9\" (UID: \"9a43e441-37c5-4ada-9bb8-f610dd18a5f9\") " Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.820490 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-combined-ca-bundle\") pod \"9a43e441-37c5-4ada-9bb8-f610dd18a5f9\" (UID: \"9a43e441-37c5-4ada-9bb8-f610dd18a5f9\") " Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.820716 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-config-data\") pod \"9a43e441-37c5-4ada-9bb8-f610dd18a5f9\" (UID: \"9a43e441-37c5-4ada-9bb8-f610dd18a5f9\") " Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.827093 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-logs" (OuterVolumeSpecName: "logs") pod "9a43e441-37c5-4ada-9bb8-f610dd18a5f9" (UID: "9a43e441-37c5-4ada-9bb8-f610dd18a5f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.832565 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-kube-api-access-5jpzz" (OuterVolumeSpecName: "kube-api-access-5jpzz") pod "9a43e441-37c5-4ada-9bb8-f610dd18a5f9" (UID: "9a43e441-37c5-4ada-9bb8-f610dd18a5f9"). InnerVolumeSpecName "kube-api-access-5jpzz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.842089 4909 scope.go:117] "RemoveContainer" containerID="128f7a7b48e854c2eb1d60c78f8b94137e95d0688068affb37014abeb6b28f58" Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.845726 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.845701009 podStartE2EDuration="2.845701009s" podCreationTimestamp="2025-12-01 10:50:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:50:44.836480392 +0000 UTC m=+1162.070951300" watchObservedRunningTime="2025-12-01 10:50:44.845701009 +0000 UTC m=+1162.080171907" Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.857455 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a43e441-37c5-4ada-9bb8-f610dd18a5f9" (UID: "9a43e441-37c5-4ada-9bb8-f610dd18a5f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.866263 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-config-data" (OuterVolumeSpecName: "config-data") pod "9a43e441-37c5-4ada-9bb8-f610dd18a5f9" (UID: "9a43e441-37c5-4ada-9bb8-f610dd18a5f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.923976 4909 scope.go:117] "RemoveContainer" containerID="d74a13493d137e296ab28e41b51c2b340251a8e9df2bc934a7e4ec011b78a7e3" Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.925817 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jpzz\" (UniqueName: \"kubernetes.io/projected/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-kube-api-access-5jpzz\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.925845 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.925861 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.925891 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a43e441-37c5-4ada-9bb8-f610dd18a5f9-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:44 crc kubenswrapper[4909]: E1201 10:50:44.926941 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d74a13493d137e296ab28e41b51c2b340251a8e9df2bc934a7e4ec011b78a7e3\": container with ID starting with d74a13493d137e296ab28e41b51c2b340251a8e9df2bc934a7e4ec011b78a7e3 not found: ID does not exist" containerID="d74a13493d137e296ab28e41b51c2b340251a8e9df2bc934a7e4ec011b78a7e3" Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.927006 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d74a13493d137e296ab28e41b51c2b340251a8e9df2bc934a7e4ec011b78a7e3"} 
err="failed to get container status \"d74a13493d137e296ab28e41b51c2b340251a8e9df2bc934a7e4ec011b78a7e3\": rpc error: code = NotFound desc = could not find container \"d74a13493d137e296ab28e41b51c2b340251a8e9df2bc934a7e4ec011b78a7e3\": container with ID starting with d74a13493d137e296ab28e41b51c2b340251a8e9df2bc934a7e4ec011b78a7e3 not found: ID does not exist" Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.927038 4909 scope.go:117] "RemoveContainer" containerID="128f7a7b48e854c2eb1d60c78f8b94137e95d0688068affb37014abeb6b28f58" Dec 01 10:50:44 crc kubenswrapper[4909]: E1201 10:50:44.927449 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"128f7a7b48e854c2eb1d60c78f8b94137e95d0688068affb37014abeb6b28f58\": container with ID starting with 128f7a7b48e854c2eb1d60c78f8b94137e95d0688068affb37014abeb6b28f58 not found: ID does not exist" containerID="128f7a7b48e854c2eb1d60c78f8b94137e95d0688068affb37014abeb6b28f58" Dec 01 10:50:44 crc kubenswrapper[4909]: I1201 10:50:44.927479 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"128f7a7b48e854c2eb1d60c78f8b94137e95d0688068affb37014abeb6b28f58"} err="failed to get container status \"128f7a7b48e854c2eb1d60c78f8b94137e95d0688068affb37014abeb6b28f58\": rpc error: code = NotFound desc = could not find container \"128f7a7b48e854c2eb1d60c78f8b94137e95d0688068affb37014abeb6b28f58\": container with ID starting with 128f7a7b48e854c2eb1d60c78f8b94137e95d0688068affb37014abeb6b28f58 not found: ID does not exist" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.157107 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.180860 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.193595 4909 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-api-0"] Dec 01 10:50:45 crc kubenswrapper[4909]: E1201 10:50:45.194257 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a43e441-37c5-4ada-9bb8-f610dd18a5f9" containerName="nova-api-api" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.194287 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a43e441-37c5-4ada-9bb8-f610dd18a5f9" containerName="nova-api-api" Dec 01 10:50:45 crc kubenswrapper[4909]: E1201 10:50:45.194321 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a43e441-37c5-4ada-9bb8-f610dd18a5f9" containerName="nova-api-log" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.194331 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a43e441-37c5-4ada-9bb8-f610dd18a5f9" containerName="nova-api-log" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.194592 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a43e441-37c5-4ada-9bb8-f610dd18a5f9" containerName="nova-api-api" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.194633 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a43e441-37c5-4ada-9bb8-f610dd18a5f9" containerName="nova-api-log" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.195999 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.206219 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.215778 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.232821 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2qh9\" (UniqueName: \"kubernetes.io/projected/0484e4ef-5031-4901-bc42-976828d7ee3b-kube-api-access-p2qh9\") pod \"nova-api-0\" (UID: \"0484e4ef-5031-4901-bc42-976828d7ee3b\") " pod="openstack/nova-api-0" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.232911 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0484e4ef-5031-4901-bc42-976828d7ee3b-config-data\") pod \"nova-api-0\" (UID: \"0484e4ef-5031-4901-bc42-976828d7ee3b\") " pod="openstack/nova-api-0" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.232933 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0484e4ef-5031-4901-bc42-976828d7ee3b-logs\") pod \"nova-api-0\" (UID: \"0484e4ef-5031-4901-bc42-976828d7ee3b\") " pod="openstack/nova-api-0" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.232965 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0484e4ef-5031-4901-bc42-976828d7ee3b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0484e4ef-5031-4901-bc42-976828d7ee3b\") " pod="openstack/nova-api-0" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.273558 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9a43e441-37c5-4ada-9bb8-f610dd18a5f9" path="/var/lib/kubelet/pods/9a43e441-37c5-4ada-9bb8-f610dd18a5f9/volumes" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.334848 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2qh9\" (UniqueName: \"kubernetes.io/projected/0484e4ef-5031-4901-bc42-976828d7ee3b-kube-api-access-p2qh9\") pod \"nova-api-0\" (UID: \"0484e4ef-5031-4901-bc42-976828d7ee3b\") " pod="openstack/nova-api-0" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.335006 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0484e4ef-5031-4901-bc42-976828d7ee3b-config-data\") pod \"nova-api-0\" (UID: \"0484e4ef-5031-4901-bc42-976828d7ee3b\") " pod="openstack/nova-api-0" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.335913 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0484e4ef-5031-4901-bc42-976828d7ee3b-logs\") pod \"nova-api-0\" (UID: \"0484e4ef-5031-4901-bc42-976828d7ee3b\") " pod="openstack/nova-api-0" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.335991 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0484e4ef-5031-4901-bc42-976828d7ee3b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0484e4ef-5031-4901-bc42-976828d7ee3b\") " pod="openstack/nova-api-0" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.336632 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0484e4ef-5031-4901-bc42-976828d7ee3b-logs\") pod \"nova-api-0\" (UID: \"0484e4ef-5031-4901-bc42-976828d7ee3b\") " pod="openstack/nova-api-0" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.343576 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0484e4ef-5031-4901-bc42-976828d7ee3b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0484e4ef-5031-4901-bc42-976828d7ee3b\") " pod="openstack/nova-api-0" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.356118 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0484e4ef-5031-4901-bc42-976828d7ee3b-config-data\") pod \"nova-api-0\" (UID: \"0484e4ef-5031-4901-bc42-976828d7ee3b\") " pod="openstack/nova-api-0" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.362392 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2qh9\" (UniqueName: \"kubernetes.io/projected/0484e4ef-5031-4901-bc42-976828d7ee3b-kube-api-access-p2qh9\") pod \"nova-api-0\" (UID: \"0484e4ef-5031-4901-bc42-976828d7ee3b\") " pod="openstack/nova-api-0" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.535906 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.832926 4909 generic.go:334] "Generic (PLEG): container finished" podID="f9bcf0a4-b949-4dc4-8846-7bcefd6961bc" containerID="79d74f4d034728278ddf1bbcb348d7412f4b09e11b856263ef977121dc2bc398" exitCode=0 Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.832964 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f9bcf0a4-b949-4dc4-8846-7bcefd6961bc","Type":"ContainerDied","Data":"79d74f4d034728278ddf1bbcb348d7412f4b09e11b856263ef977121dc2bc398"} Dec 01 10:50:45 crc kubenswrapper[4909]: I1201 10:50:45.997349 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.235846 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.357487 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmx5g\" (UniqueName: \"kubernetes.io/projected/f9bcf0a4-b949-4dc4-8846-7bcefd6961bc-kube-api-access-wmx5g\") pod \"f9bcf0a4-b949-4dc4-8846-7bcefd6961bc\" (UID: \"f9bcf0a4-b949-4dc4-8846-7bcefd6961bc\") " Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.357804 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9bcf0a4-b949-4dc4-8846-7bcefd6961bc-combined-ca-bundle\") pod \"f9bcf0a4-b949-4dc4-8846-7bcefd6961bc\" (UID: \"f9bcf0a4-b949-4dc4-8846-7bcefd6961bc\") " Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.357937 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9bcf0a4-b949-4dc4-8846-7bcefd6961bc-config-data\") pod \"f9bcf0a4-b949-4dc4-8846-7bcefd6961bc\" (UID: \"f9bcf0a4-b949-4dc4-8846-7bcefd6961bc\") " Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.362191 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9bcf0a4-b949-4dc4-8846-7bcefd6961bc-kube-api-access-wmx5g" (OuterVolumeSpecName: "kube-api-access-wmx5g") pod "f9bcf0a4-b949-4dc4-8846-7bcefd6961bc" (UID: "f9bcf0a4-b949-4dc4-8846-7bcefd6961bc"). InnerVolumeSpecName "kube-api-access-wmx5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.385953 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9bcf0a4-b949-4dc4-8846-7bcefd6961bc-config-data" (OuterVolumeSpecName: "config-data") pod "f9bcf0a4-b949-4dc4-8846-7bcefd6961bc" (UID: "f9bcf0a4-b949-4dc4-8846-7bcefd6961bc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.393053 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9bcf0a4-b949-4dc4-8846-7bcefd6961bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9bcf0a4-b949-4dc4-8846-7bcefd6961bc" (UID: "f9bcf0a4-b949-4dc4-8846-7bcefd6961bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.460464 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9bcf0a4-b949-4dc4-8846-7bcefd6961bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.460859 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9bcf0a4-b949-4dc4-8846-7bcefd6961bc-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.460886 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmx5g\" (UniqueName: \"kubernetes.io/projected/f9bcf0a4-b949-4dc4-8846-7bcefd6961bc-kube-api-access-wmx5g\") on node \"crc\" DevicePath \"\"" Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.846049 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0484e4ef-5031-4901-bc42-976828d7ee3b","Type":"ContainerStarted","Data":"ad4992957cdd0f34caac4cca59be4d02c35e93a3356b3e0b2a164884c87221cd"} Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.846109 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0484e4ef-5031-4901-bc42-976828d7ee3b","Type":"ContainerStarted","Data":"55ddc2932143ac83f01d5dcb0a7eb95bbd3f3520721d67fcf8ae7a24940cb80f"} Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.846123 4909 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0484e4ef-5031-4901-bc42-976828d7ee3b","Type":"ContainerStarted","Data":"a464fe1ae5d8b3b3645e9123af109fd46150e8f3484a793073d35d34fb6ab80b"} Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.848691 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f9bcf0a4-b949-4dc4-8846-7bcefd6961bc","Type":"ContainerDied","Data":"a90f3e957679e4d63e36bbd8ee9d05524afbf9d6f5d09c45a8ba7174561030da"} Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.848730 4909 scope.go:117] "RemoveContainer" containerID="79d74f4d034728278ddf1bbcb348d7412f4b09e11b856263ef977121dc2bc398" Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.848839 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.898394 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.898356728 podStartE2EDuration="1.898356728s" podCreationTimestamp="2025-12-01 10:50:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:50:46.873186484 +0000 UTC m=+1164.107657382" watchObservedRunningTime="2025-12-01 10:50:46.898356728 +0000 UTC m=+1164.132827626" Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.949462 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.963002 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.972649 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:50:46 crc kubenswrapper[4909]: E1201 10:50:46.973229 4909 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f9bcf0a4-b949-4dc4-8846-7bcefd6961bc" containerName="nova-scheduler-scheduler" Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.973253 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9bcf0a4-b949-4dc4-8846-7bcefd6961bc" containerName="nova-scheduler-scheduler" Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.973474 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9bcf0a4-b949-4dc4-8846-7bcefd6961bc" containerName="nova-scheduler-scheduler" Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.974407 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.976766 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 10:50:46 crc kubenswrapper[4909]: I1201 10:50:46.980864 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:50:47 crc kubenswrapper[4909]: I1201 10:50:47.077819 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b421bbb3-25fb-4eb5-ac53-a03f2f941b6a-config-data\") pod \"nova-scheduler-0\" (UID: \"b421bbb3-25fb-4eb5-ac53-a03f2f941b6a\") " pod="openstack/nova-scheduler-0" Dec 01 10:50:47 crc kubenswrapper[4909]: I1201 10:50:47.077930 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhkfg\" (UniqueName: \"kubernetes.io/projected/b421bbb3-25fb-4eb5-ac53-a03f2f941b6a-kube-api-access-hhkfg\") pod \"nova-scheduler-0\" (UID: \"b421bbb3-25fb-4eb5-ac53-a03f2f941b6a\") " pod="openstack/nova-scheduler-0" Dec 01 10:50:47 crc kubenswrapper[4909]: I1201 10:50:47.077963 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b421bbb3-25fb-4eb5-ac53-a03f2f941b6a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b421bbb3-25fb-4eb5-ac53-a03f2f941b6a\") " pod="openstack/nova-scheduler-0" Dec 01 10:50:47 crc kubenswrapper[4909]: I1201 10:50:47.179709 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b421bbb3-25fb-4eb5-ac53-a03f2f941b6a-config-data\") pod \"nova-scheduler-0\" (UID: \"b421bbb3-25fb-4eb5-ac53-a03f2f941b6a\") " pod="openstack/nova-scheduler-0" Dec 01 10:50:47 crc kubenswrapper[4909]: I1201 10:50:47.179783 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhkfg\" (UniqueName: \"kubernetes.io/projected/b421bbb3-25fb-4eb5-ac53-a03f2f941b6a-kube-api-access-hhkfg\") pod \"nova-scheduler-0\" (UID: \"b421bbb3-25fb-4eb5-ac53-a03f2f941b6a\") " pod="openstack/nova-scheduler-0" Dec 01 10:50:47 crc kubenswrapper[4909]: I1201 10:50:47.179809 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b421bbb3-25fb-4eb5-ac53-a03f2f941b6a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b421bbb3-25fb-4eb5-ac53-a03f2f941b6a\") " pod="openstack/nova-scheduler-0" Dec 01 10:50:47 crc kubenswrapper[4909]: I1201 10:50:47.188695 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b421bbb3-25fb-4eb5-ac53-a03f2f941b6a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b421bbb3-25fb-4eb5-ac53-a03f2f941b6a\") " pod="openstack/nova-scheduler-0" Dec 01 10:50:47 crc kubenswrapper[4909]: I1201 10:50:47.188732 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b421bbb3-25fb-4eb5-ac53-a03f2f941b6a-config-data\") pod \"nova-scheduler-0\" (UID: \"b421bbb3-25fb-4eb5-ac53-a03f2f941b6a\") " 
pod="openstack/nova-scheduler-0" Dec 01 10:50:47 crc kubenswrapper[4909]: I1201 10:50:47.197213 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhkfg\" (UniqueName: \"kubernetes.io/projected/b421bbb3-25fb-4eb5-ac53-a03f2f941b6a-kube-api-access-hhkfg\") pod \"nova-scheduler-0\" (UID: \"b421bbb3-25fb-4eb5-ac53-a03f2f941b6a\") " pod="openstack/nova-scheduler-0" Dec 01 10:50:47 crc kubenswrapper[4909]: I1201 10:50:47.268549 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9bcf0a4-b949-4dc4-8846-7bcefd6961bc" path="/var/lib/kubelet/pods/f9bcf0a4-b949-4dc4-8846-7bcefd6961bc/volumes" Dec 01 10:50:47 crc kubenswrapper[4909]: I1201 10:50:47.295400 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:50:47 crc kubenswrapper[4909]: I1201 10:50:47.766782 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:50:47 crc kubenswrapper[4909]: W1201 10:50:47.772918 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb421bbb3_25fb_4eb5_ac53_a03f2f941b6a.slice/crio-1f4a6c028440273226731ba4f9413be19b4f31c96432bb53889e9a2174acf215 WatchSource:0}: Error finding container 1f4a6c028440273226731ba4f9413be19b4f31c96432bb53889e9a2174acf215: Status 404 returned error can't find the container with id 1f4a6c028440273226731ba4f9413be19b4f31c96432bb53889e9a2174acf215 Dec 01 10:50:47 crc kubenswrapper[4909]: I1201 10:50:47.860204 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b421bbb3-25fb-4eb5-ac53-a03f2f941b6a","Type":"ContainerStarted","Data":"1f4a6c028440273226731ba4f9413be19b4f31c96432bb53889e9a2174acf215"} Dec 01 10:50:48 crc kubenswrapper[4909]: I1201 10:50:48.219556 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 
10:50:48 crc kubenswrapper[4909]: I1201 10:50:48.220023 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 10:50:48 crc kubenswrapper[4909]: I1201 10:50:48.868280 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b421bbb3-25fb-4eb5-ac53-a03f2f941b6a","Type":"ContainerStarted","Data":"f6e3d23adda41159f95ab80623ddd7858758ac4019ff386005a408f5414d7960"} Dec 01 10:50:48 crc kubenswrapper[4909]: I1201 10:50:48.902456 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.902431354 podStartE2EDuration="2.902431354s" podCreationTimestamp="2025-12-01 10:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:50:48.900160913 +0000 UTC m=+1166.134631811" watchObservedRunningTime="2025-12-01 10:50:48.902431354 +0000 UTC m=+1166.136902252" Dec 01 10:50:51 crc kubenswrapper[4909]: I1201 10:50:51.187447 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 01 10:50:52 crc kubenswrapper[4909]: I1201 10:50:52.295973 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 10:50:52 crc kubenswrapper[4909]: I1201 10:50:52.556514 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 10:50:53 crc kubenswrapper[4909]: I1201 10:50:53.219617 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 10:50:53 crc kubenswrapper[4909]: I1201 10:50:53.219678 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 10:50:54 crc kubenswrapper[4909]: I1201 10:50:54.232227 4909 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-metadata-0" podUID="16639ce4-cbc0-437d-b43c-63f9800bb171" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.176:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 10:50:54 crc kubenswrapper[4909]: I1201 10:50:54.232227 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="16639ce4-cbc0-437d-b43c-63f9800bb171" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.176:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 10:50:55 crc kubenswrapper[4909]: I1201 10:50:55.536394 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 10:50:55 crc kubenswrapper[4909]: I1201 10:50:55.536455 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 10:50:56 crc kubenswrapper[4909]: I1201 10:50:56.619151 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0484e4ef-5031-4901-bc42-976828d7ee3b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.177:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 10:50:56 crc kubenswrapper[4909]: I1201 10:50:56.619151 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0484e4ef-5031-4901-bc42-976828d7ee3b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.177:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 10:50:57 crc kubenswrapper[4909]: I1201 10:50:57.296042 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 10:50:57 crc kubenswrapper[4909]: I1201 10:50:57.323802 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-scheduler-0" Dec 01 10:50:57 crc kubenswrapper[4909]: I1201 10:50:57.985589 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 10:51:03 crc kubenswrapper[4909]: I1201 10:51:03.239905 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 10:51:03 crc kubenswrapper[4909]: I1201 10:51:03.240478 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 10:51:03 crc kubenswrapper[4909]: I1201 10:51:03.246496 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 10:51:03 crc kubenswrapper[4909]: I1201 10:51:03.249641 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 10:51:04 crc kubenswrapper[4909]: I1201 10:51:04.049616 4909 generic.go:334] "Generic (PLEG): container finished" podID="cf02fa2a-a364-448c-be13-3e2ec28e591f" containerID="e2312592bf88aaf0b98d60601b816a56d1ee3d31d786bfd6630686b55df85c37" exitCode=137 Dec 01 10:51:04 crc kubenswrapper[4909]: I1201 10:51:04.049692 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cf02fa2a-a364-448c-be13-3e2ec28e591f","Type":"ContainerDied","Data":"e2312592bf88aaf0b98d60601b816a56d1ee3d31d786bfd6630686b55df85c37"} Dec 01 10:51:04 crc kubenswrapper[4909]: I1201 10:51:04.440772 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:04 crc kubenswrapper[4909]: I1201 10:51:04.534929 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf02fa2a-a364-448c-be13-3e2ec28e591f-combined-ca-bundle\") pod \"cf02fa2a-a364-448c-be13-3e2ec28e591f\" (UID: \"cf02fa2a-a364-448c-be13-3e2ec28e591f\") " Dec 01 10:51:04 crc kubenswrapper[4909]: I1201 10:51:04.535123 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf02fa2a-a364-448c-be13-3e2ec28e591f-config-data\") pod \"cf02fa2a-a364-448c-be13-3e2ec28e591f\" (UID: \"cf02fa2a-a364-448c-be13-3e2ec28e591f\") " Dec 01 10:51:04 crc kubenswrapper[4909]: I1201 10:51:04.535296 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mqkl\" (UniqueName: \"kubernetes.io/projected/cf02fa2a-a364-448c-be13-3e2ec28e591f-kube-api-access-5mqkl\") pod \"cf02fa2a-a364-448c-be13-3e2ec28e591f\" (UID: \"cf02fa2a-a364-448c-be13-3e2ec28e591f\") " Dec 01 10:51:04 crc kubenswrapper[4909]: I1201 10:51:04.549102 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf02fa2a-a364-448c-be13-3e2ec28e591f-kube-api-access-5mqkl" (OuterVolumeSpecName: "kube-api-access-5mqkl") pod "cf02fa2a-a364-448c-be13-3e2ec28e591f" (UID: "cf02fa2a-a364-448c-be13-3e2ec28e591f"). InnerVolumeSpecName "kube-api-access-5mqkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:51:04 crc kubenswrapper[4909]: I1201 10:51:04.563866 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf02fa2a-a364-448c-be13-3e2ec28e591f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf02fa2a-a364-448c-be13-3e2ec28e591f" (UID: "cf02fa2a-a364-448c-be13-3e2ec28e591f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:04 crc kubenswrapper[4909]: I1201 10:51:04.570752 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf02fa2a-a364-448c-be13-3e2ec28e591f-config-data" (OuterVolumeSpecName: "config-data") pod "cf02fa2a-a364-448c-be13-3e2ec28e591f" (UID: "cf02fa2a-a364-448c-be13-3e2ec28e591f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:04 crc kubenswrapper[4909]: I1201 10:51:04.637072 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mqkl\" (UniqueName: \"kubernetes.io/projected/cf02fa2a-a364-448c-be13-3e2ec28e591f-kube-api-access-5mqkl\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:04 crc kubenswrapper[4909]: I1201 10:51:04.637115 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf02fa2a-a364-448c-be13-3e2ec28e591f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:04 crc kubenswrapper[4909]: I1201 10:51:04.637128 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf02fa2a-a364-448c-be13-3e2ec28e591f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.061672 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cf02fa2a-a364-448c-be13-3e2ec28e591f","Type":"ContainerDied","Data":"9ed66554aa3d01e06d43d78e074d2a40d841a92d43ba23420149e5b71690b3c1"} Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.061718 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.061756 4909 scope.go:117] "RemoveContainer" containerID="e2312592bf88aaf0b98d60601b816a56d1ee3d31d786bfd6630686b55df85c37" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.108082 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.124222 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.135198 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 10:51:05 crc kubenswrapper[4909]: E1201 10:51:05.135806 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf02fa2a-a364-448c-be13-3e2ec28e591f" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.135831 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf02fa2a-a364-448c-be13-3e2ec28e591f" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.136087 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf02fa2a-a364-448c-be13-3e2ec28e591f" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.136995 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.141341 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.141527 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.142186 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.149564 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.249068 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1abbfc5-9c24-418a-be94-4a74fd32e687-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e1abbfc5-9c24-418a-be94-4a74fd32e687\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.249451 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1abbfc5-9c24-418a-be94-4a74fd32e687-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e1abbfc5-9c24-418a-be94-4a74fd32e687\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.249513 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkg62\" (UniqueName: \"kubernetes.io/projected/e1abbfc5-9c24-418a-be94-4a74fd32e687-kube-api-access-nkg62\") pod \"nova-cell1-novncproxy-0\" (UID: \"e1abbfc5-9c24-418a-be94-4a74fd32e687\") " pod="openstack/nova-cell1-novncproxy-0" Dec 
01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.249588 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1abbfc5-9c24-418a-be94-4a74fd32e687-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e1abbfc5-9c24-418a-be94-4a74fd32e687\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.249651 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1abbfc5-9c24-418a-be94-4a74fd32e687-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e1abbfc5-9c24-418a-be94-4a74fd32e687\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.268293 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf02fa2a-a364-448c-be13-3e2ec28e591f" path="/var/lib/kubelet/pods/cf02fa2a-a364-448c-be13-3e2ec28e591f/volumes" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.350976 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1abbfc5-9c24-418a-be94-4a74fd32e687-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e1abbfc5-9c24-418a-be94-4a74fd32e687\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.351093 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1abbfc5-9c24-418a-be94-4a74fd32e687-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e1abbfc5-9c24-418a-be94-4a74fd32e687\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.351159 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e1abbfc5-9c24-418a-be94-4a74fd32e687-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e1abbfc5-9c24-418a-be94-4a74fd32e687\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.351195 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1abbfc5-9c24-418a-be94-4a74fd32e687-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e1abbfc5-9c24-418a-be94-4a74fd32e687\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.351264 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkg62\" (UniqueName: \"kubernetes.io/projected/e1abbfc5-9c24-418a-be94-4a74fd32e687-kube-api-access-nkg62\") pod \"nova-cell1-novncproxy-0\" (UID: \"e1abbfc5-9c24-418a-be94-4a74fd32e687\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.355833 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1abbfc5-9c24-418a-be94-4a74fd32e687-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e1abbfc5-9c24-418a-be94-4a74fd32e687\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.356160 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1abbfc5-9c24-418a-be94-4a74fd32e687-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e1abbfc5-9c24-418a-be94-4a74fd32e687\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.356549 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1abbfc5-9c24-418a-be94-4a74fd32e687-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"e1abbfc5-9c24-418a-be94-4a74fd32e687\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.356709 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1abbfc5-9c24-418a-be94-4a74fd32e687-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e1abbfc5-9c24-418a-be94-4a74fd32e687\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.373293 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkg62\" (UniqueName: \"kubernetes.io/projected/e1abbfc5-9c24-418a-be94-4a74fd32e687-kube-api-access-nkg62\") pod \"nova-cell1-novncproxy-0\" (UID: \"e1abbfc5-9c24-418a-be94-4a74fd32e687\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.457760 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.541830 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.542819 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.545432 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.546409 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 10:51:05 crc kubenswrapper[4909]: I1201 10:51:05.945354 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.096984 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e1abbfc5-9c24-418a-be94-4a74fd32e687","Type":"ContainerStarted","Data":"6cb0c9859a7ee843d169a77cc45691ce16e953c5959d78ee79fff0e83c1fc9ce"} Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.100515 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.107381 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.399780 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-94dvb"] Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.401604 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-94dvb" Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.434783 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-94dvb"] Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.519736 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29cl8\" (UniqueName: \"kubernetes.io/projected/63a9d1b0-6050-4bf6-b247-aea03752927e-kube-api-access-29cl8\") pod \"dnsmasq-dns-5b856c5697-94dvb\" (UID: \"63a9d1b0-6050-4bf6-b247-aea03752927e\") " pod="openstack/dnsmasq-dns-5b856c5697-94dvb" Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.519794 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-94dvb\" (UID: \"63a9d1b0-6050-4bf6-b247-aea03752927e\") " pod="openstack/dnsmasq-dns-5b856c5697-94dvb" Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.519841 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-dns-svc\") pod \"dnsmasq-dns-5b856c5697-94dvb\" (UID: \"63a9d1b0-6050-4bf6-b247-aea03752927e\") " pod="openstack/dnsmasq-dns-5b856c5697-94dvb" Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.519980 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-config\") pod \"dnsmasq-dns-5b856c5697-94dvb\" (UID: \"63a9d1b0-6050-4bf6-b247-aea03752927e\") " pod="openstack/dnsmasq-dns-5b856c5697-94dvb" Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.520003 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-94dvb\" (UID: \"63a9d1b0-6050-4bf6-b247-aea03752927e\") " pod="openstack/dnsmasq-dns-5b856c5697-94dvb" Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.624289 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29cl8\" (UniqueName: \"kubernetes.io/projected/63a9d1b0-6050-4bf6-b247-aea03752927e-kube-api-access-29cl8\") pod \"dnsmasq-dns-5b856c5697-94dvb\" (UID: \"63a9d1b0-6050-4bf6-b247-aea03752927e\") " pod="openstack/dnsmasq-dns-5b856c5697-94dvb" Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.624377 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-94dvb\" (UID: \"63a9d1b0-6050-4bf6-b247-aea03752927e\") " pod="openstack/dnsmasq-dns-5b856c5697-94dvb" Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.624439 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-dns-svc\") pod \"dnsmasq-dns-5b856c5697-94dvb\" (UID: \"63a9d1b0-6050-4bf6-b247-aea03752927e\") " pod="openstack/dnsmasq-dns-5b856c5697-94dvb" Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.624496 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-config\") pod \"dnsmasq-dns-5b856c5697-94dvb\" (UID: \"63a9d1b0-6050-4bf6-b247-aea03752927e\") " pod="openstack/dnsmasq-dns-5b856c5697-94dvb" Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.624524 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-94dvb\" (UID: \"63a9d1b0-6050-4bf6-b247-aea03752927e\") " pod="openstack/dnsmasq-dns-5b856c5697-94dvb" Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.626449 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-94dvb\" (UID: \"63a9d1b0-6050-4bf6-b247-aea03752927e\") " pod="openstack/dnsmasq-dns-5b856c5697-94dvb" Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.626463 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-94dvb\" (UID: \"63a9d1b0-6050-4bf6-b247-aea03752927e\") " pod="openstack/dnsmasq-dns-5b856c5697-94dvb" Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.627049 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-dns-svc\") pod 
\"dnsmasq-dns-5b856c5697-94dvb\" (UID: \"63a9d1b0-6050-4bf6-b247-aea03752927e\") " pod="openstack/dnsmasq-dns-5b856c5697-94dvb" Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.627353 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-config\") pod \"dnsmasq-dns-5b856c5697-94dvb\" (UID: \"63a9d1b0-6050-4bf6-b247-aea03752927e\") " pod="openstack/dnsmasq-dns-5b856c5697-94dvb" Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.665436 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29cl8\" (UniqueName: \"kubernetes.io/projected/63a9d1b0-6050-4bf6-b247-aea03752927e-kube-api-access-29cl8\") pod \"dnsmasq-dns-5b856c5697-94dvb\" (UID: \"63a9d1b0-6050-4bf6-b247-aea03752927e\") " pod="openstack/dnsmasq-dns-5b856c5697-94dvb" Dec 01 10:51:06 crc kubenswrapper[4909]: I1201 10:51:06.763632 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-94dvb" Dec 01 10:51:07 crc kubenswrapper[4909]: I1201 10:51:07.109455 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e1abbfc5-9c24-418a-be94-4a74fd32e687","Type":"ContainerStarted","Data":"f433b0bbbeab37d6f6936fd0752bcf917c93225b1dbb631e88f3001d9316ba11"} Dec 01 10:51:07 crc kubenswrapper[4909]: I1201 10:51:07.148482 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.148452199 podStartE2EDuration="2.148452199s" podCreationTimestamp="2025-12-01 10:51:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:51:07.130985157 +0000 UTC m=+1184.365456075" watchObservedRunningTime="2025-12-01 10:51:07.148452199 +0000 UTC m=+1184.382923097" Dec 01 10:51:07 crc kubenswrapper[4909]: I1201 10:51:07.302713 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-94dvb"] Dec 01 10:51:07 crc kubenswrapper[4909]: W1201 10:51:07.305009 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63a9d1b0_6050_4bf6_b247_aea03752927e.slice/crio-b4609f705960e4c94f463f1e736d2d5c99644409d49011bccd0362d16789e236 WatchSource:0}: Error finding container b4609f705960e4c94f463f1e736d2d5c99644409d49011bccd0362d16789e236: Status 404 returned error can't find the container with id b4609f705960e4c94f463f1e736d2d5c99644409d49011bccd0362d16789e236 Dec 01 10:51:08 crc kubenswrapper[4909]: I1201 10:51:08.122137 4909 generic.go:334] "Generic (PLEG): container finished" podID="63a9d1b0-6050-4bf6-b247-aea03752927e" containerID="32688dbaea43a969a5ad4773ee5a21a91bf5cf07009ca0e97c9fe19bfcbc9f99" exitCode=0 Dec 01 10:51:08 crc kubenswrapper[4909]: I1201 10:51:08.122216 4909 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-94dvb" event={"ID":"63a9d1b0-6050-4bf6-b247-aea03752927e","Type":"ContainerDied","Data":"32688dbaea43a969a5ad4773ee5a21a91bf5cf07009ca0e97c9fe19bfcbc9f99"} Dec 01 10:51:08 crc kubenswrapper[4909]: I1201 10:51:08.123086 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-94dvb" event={"ID":"63a9d1b0-6050-4bf6-b247-aea03752927e","Type":"ContainerStarted","Data":"b4609f705960e4c94f463f1e736d2d5c99644409d49011bccd0362d16789e236"} Dec 01 10:51:08 crc kubenswrapper[4909]: I1201 10:51:08.751926 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:51:08 crc kubenswrapper[4909]: I1201 10:51:08.752943 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" containerName="proxy-httpd" containerID="cri-o://9a1b0540843816fa4ca68108f262de431803a2d592aacac4dc34bfef65cda9cc" gracePeriod=30 Dec 01 10:51:08 crc kubenswrapper[4909]: I1201 10:51:08.753018 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" containerName="sg-core" containerID="cri-o://c4e291e17088f3bd6999a1f46b71ebae3888a1b39ba4ce330ed35bbf7715990d" gracePeriod=30 Dec 01 10:51:08 crc kubenswrapper[4909]: I1201 10:51:08.753127 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" containerName="ceilometer-notification-agent" containerID="cri-o://1221e9a1d560e73827baa95fefe2155097b4a605a0dbac521dd2cd90a9db886a" gracePeriod=30 Dec 01 10:51:08 crc kubenswrapper[4909]: I1201 10:51:08.752857 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" containerName="ceilometer-central-agent" 
containerID="cri-o://01ec51ae59c583b44131b185cf83cff6215431c90f62f5903412a2fa0a1a089a" gracePeriod=30 Dec 01 10:51:09 crc kubenswrapper[4909]: I1201 10:51:09.098675 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:51:09 crc kubenswrapper[4909]: I1201 10:51:09.133432 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-94dvb" event={"ID":"63a9d1b0-6050-4bf6-b247-aea03752927e","Type":"ContainerStarted","Data":"507523ee001f6a4a4511910a78049b4a41e08c65abb6ff2119c45f62ed7230c2"} Dec 01 10:51:09 crc kubenswrapper[4909]: I1201 10:51:09.136075 4909 generic.go:334] "Generic (PLEG): container finished" podID="3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" containerID="9a1b0540843816fa4ca68108f262de431803a2d592aacac4dc34bfef65cda9cc" exitCode=0 Dec 01 10:51:09 crc kubenswrapper[4909]: I1201 10:51:09.136146 4909 generic.go:334] "Generic (PLEG): container finished" podID="3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" containerID="c4e291e17088f3bd6999a1f46b71ebae3888a1b39ba4ce330ed35bbf7715990d" exitCode=2 Dec 01 10:51:09 crc kubenswrapper[4909]: I1201 10:51:09.136157 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee","Type":"ContainerDied","Data":"9a1b0540843816fa4ca68108f262de431803a2d592aacac4dc34bfef65cda9cc"} Dec 01 10:51:09 crc kubenswrapper[4909]: I1201 10:51:09.136213 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee","Type":"ContainerDied","Data":"c4e291e17088f3bd6999a1f46b71ebae3888a1b39ba4ce330ed35bbf7715990d"} Dec 01 10:51:09 crc kubenswrapper[4909]: I1201 10:51:09.136443 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0484e4ef-5031-4901-bc42-976828d7ee3b" containerName="nova-api-log" containerID="cri-o://55ddc2932143ac83f01d5dcb0a7eb95bbd3f3520721d67fcf8ae7a24940cb80f" 
gracePeriod=30 Dec 01 10:51:09 crc kubenswrapper[4909]: I1201 10:51:09.136540 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0484e4ef-5031-4901-bc42-976828d7ee3b" containerName="nova-api-api" containerID="cri-o://ad4992957cdd0f34caac4cca59be4d02c35e93a3356b3e0b2a164884c87221cd" gracePeriod=30 Dec 01 10:51:09 crc kubenswrapper[4909]: I1201 10:51:09.282381 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b856c5697-94dvb" podStartSLOduration=3.282364984 podStartE2EDuration="3.282364984s" podCreationTimestamp="2025-12-01 10:51:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:51:09.255557551 +0000 UTC m=+1186.490028449" watchObservedRunningTime="2025-12-01 10:51:09.282364984 +0000 UTC m=+1186.516835882" Dec 01 10:51:10 crc kubenswrapper[4909]: I1201 10:51:10.149629 4909 generic.go:334] "Generic (PLEG): container finished" podID="3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" containerID="01ec51ae59c583b44131b185cf83cff6215431c90f62f5903412a2fa0a1a089a" exitCode=0 Dec 01 10:51:10 crc kubenswrapper[4909]: I1201 10:51:10.149718 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee","Type":"ContainerDied","Data":"01ec51ae59c583b44131b185cf83cff6215431c90f62f5903412a2fa0a1a089a"} Dec 01 10:51:10 crc kubenswrapper[4909]: I1201 10:51:10.153137 4909 generic.go:334] "Generic (PLEG): container finished" podID="0484e4ef-5031-4901-bc42-976828d7ee3b" containerID="55ddc2932143ac83f01d5dcb0a7eb95bbd3f3520721d67fcf8ae7a24940cb80f" exitCode=143 Dec 01 10:51:10 crc kubenswrapper[4909]: I1201 10:51:10.153216 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"0484e4ef-5031-4901-bc42-976828d7ee3b","Type":"ContainerDied","Data":"55ddc2932143ac83f01d5dcb0a7eb95bbd3f3520721d67fcf8ae7a24940cb80f"} Dec 01 10:51:10 crc kubenswrapper[4909]: I1201 10:51:10.153691 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b856c5697-94dvb" Dec 01 10:51:10 crc kubenswrapper[4909]: I1201 10:51:10.459432 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.071308 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.138934 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-sg-core-conf-yaml\") pod \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.139556 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqpqd\" (UniqueName: \"kubernetes.io/projected/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-kube-api-access-fqpqd\") pod \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.139631 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-config-data\") pod \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.139679 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-combined-ca-bundle\") pod \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.139804 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-run-httpd\") pod \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.140047 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-ceilometer-tls-certs\") pod \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.140085 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-scripts\") pod \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.140123 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-log-httpd\") pod \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\" (UID: \"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee\") " Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.141543 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" (UID: "3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.141651 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" (UID: "3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.152689 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-scripts" (OuterVolumeSpecName: "scripts") pod "3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" (UID: "3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.163447 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-kube-api-access-fqpqd" (OuterVolumeSpecName: "kube-api-access-fqpqd") pod "3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" (UID: "3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee"). InnerVolumeSpecName "kube-api-access-fqpqd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.195648 4909 generic.go:334] "Generic (PLEG): container finished" podID="3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" containerID="1221e9a1d560e73827baa95fefe2155097b4a605a0dbac521dd2cd90a9db886a" exitCode=0 Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.196201 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee","Type":"ContainerDied","Data":"1221e9a1d560e73827baa95fefe2155097b4a605a0dbac521dd2cd90a9db886a"} Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.196371 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee","Type":"ContainerDied","Data":"cb15814971d35c3c3877fd0f16d265edfaf16938755ed2833af50e93b05634ea"} Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.196563 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.196770 4909 scope.go:117] "RemoveContainer" containerID="9a1b0540843816fa4ca68108f262de431803a2d592aacac4dc34bfef65cda9cc" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.217576 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" (UID: "3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.239446 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" (UID: "3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.249101 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.249140 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.249153 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.249163 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.249173 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqpqd\" (UniqueName: \"kubernetes.io/projected/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-kube-api-access-fqpqd\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.249182 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.294777 4909 scope.go:117] "RemoveContainer" containerID="c4e291e17088f3bd6999a1f46b71ebae3888a1b39ba4ce330ed35bbf7715990d" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.301972 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" (UID: "3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.317541 4909 scope.go:117] "RemoveContainer" containerID="1221e9a1d560e73827baa95fefe2155097b4a605a0dbac521dd2cd90a9db886a" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.331341 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-config-data" (OuterVolumeSpecName: "config-data") pod "3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" (UID: "3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.344248 4909 scope.go:117] "RemoveContainer" containerID="01ec51ae59c583b44131b185cf83cff6215431c90f62f5903412a2fa0a1a089a" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.354532 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.354566 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.370509 4909 scope.go:117] "RemoveContainer" containerID="9a1b0540843816fa4ca68108f262de431803a2d592aacac4dc34bfef65cda9cc" Dec 01 10:51:11 crc kubenswrapper[4909]: E1201 10:51:11.371224 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a1b0540843816fa4ca68108f262de431803a2d592aacac4dc34bfef65cda9cc\": container with ID starting with 9a1b0540843816fa4ca68108f262de431803a2d592aacac4dc34bfef65cda9cc not found: ID does not exist" containerID="9a1b0540843816fa4ca68108f262de431803a2d592aacac4dc34bfef65cda9cc" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.371274 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a1b0540843816fa4ca68108f262de431803a2d592aacac4dc34bfef65cda9cc"} err="failed to get container status \"9a1b0540843816fa4ca68108f262de431803a2d592aacac4dc34bfef65cda9cc\": rpc error: code = NotFound desc = could not find container \"9a1b0540843816fa4ca68108f262de431803a2d592aacac4dc34bfef65cda9cc\": container with ID starting with 9a1b0540843816fa4ca68108f262de431803a2d592aacac4dc34bfef65cda9cc not found: ID does not 
exist" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.371306 4909 scope.go:117] "RemoveContainer" containerID="c4e291e17088f3bd6999a1f46b71ebae3888a1b39ba4ce330ed35bbf7715990d" Dec 01 10:51:11 crc kubenswrapper[4909]: E1201 10:51:11.371665 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4e291e17088f3bd6999a1f46b71ebae3888a1b39ba4ce330ed35bbf7715990d\": container with ID starting with c4e291e17088f3bd6999a1f46b71ebae3888a1b39ba4ce330ed35bbf7715990d not found: ID does not exist" containerID="c4e291e17088f3bd6999a1f46b71ebae3888a1b39ba4ce330ed35bbf7715990d" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.371690 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4e291e17088f3bd6999a1f46b71ebae3888a1b39ba4ce330ed35bbf7715990d"} err="failed to get container status \"c4e291e17088f3bd6999a1f46b71ebae3888a1b39ba4ce330ed35bbf7715990d\": rpc error: code = NotFound desc = could not find container \"c4e291e17088f3bd6999a1f46b71ebae3888a1b39ba4ce330ed35bbf7715990d\": container with ID starting with c4e291e17088f3bd6999a1f46b71ebae3888a1b39ba4ce330ed35bbf7715990d not found: ID does not exist" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.371709 4909 scope.go:117] "RemoveContainer" containerID="1221e9a1d560e73827baa95fefe2155097b4a605a0dbac521dd2cd90a9db886a" Dec 01 10:51:11 crc kubenswrapper[4909]: E1201 10:51:11.372238 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1221e9a1d560e73827baa95fefe2155097b4a605a0dbac521dd2cd90a9db886a\": container with ID starting with 1221e9a1d560e73827baa95fefe2155097b4a605a0dbac521dd2cd90a9db886a not found: ID does not exist" containerID="1221e9a1d560e73827baa95fefe2155097b4a605a0dbac521dd2cd90a9db886a" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.372269 4909 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1221e9a1d560e73827baa95fefe2155097b4a605a0dbac521dd2cd90a9db886a"} err="failed to get container status \"1221e9a1d560e73827baa95fefe2155097b4a605a0dbac521dd2cd90a9db886a\": rpc error: code = NotFound desc = could not find container \"1221e9a1d560e73827baa95fefe2155097b4a605a0dbac521dd2cd90a9db886a\": container with ID starting with 1221e9a1d560e73827baa95fefe2155097b4a605a0dbac521dd2cd90a9db886a not found: ID does not exist" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.372288 4909 scope.go:117] "RemoveContainer" containerID="01ec51ae59c583b44131b185cf83cff6215431c90f62f5903412a2fa0a1a089a" Dec 01 10:51:11 crc kubenswrapper[4909]: E1201 10:51:11.372569 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01ec51ae59c583b44131b185cf83cff6215431c90f62f5903412a2fa0a1a089a\": container with ID starting with 01ec51ae59c583b44131b185cf83cff6215431c90f62f5903412a2fa0a1a089a not found: ID does not exist" containerID="01ec51ae59c583b44131b185cf83cff6215431c90f62f5903412a2fa0a1a089a" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.372608 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01ec51ae59c583b44131b185cf83cff6215431c90f62f5903412a2fa0a1a089a"} err="failed to get container status \"01ec51ae59c583b44131b185cf83cff6215431c90f62f5903412a2fa0a1a089a\": rpc error: code = NotFound desc = could not find container \"01ec51ae59c583b44131b185cf83cff6215431c90f62f5903412a2fa0a1a089a\": container with ID starting with 01ec51ae59c583b44131b185cf83cff6215431c90f62f5903412a2fa0a1a089a not found: ID does not exist" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.556955 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.575438 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.597397 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:51:11 crc kubenswrapper[4909]: E1201 10:51:11.597938 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" containerName="proxy-httpd" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.597952 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" containerName="proxy-httpd" Dec 01 10:51:11 crc kubenswrapper[4909]: E1201 10:51:11.597960 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" containerName="ceilometer-central-agent" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.597967 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" containerName="ceilometer-central-agent" Dec 01 10:51:11 crc kubenswrapper[4909]: E1201 10:51:11.597974 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" containerName="ceilometer-notification-agent" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.597980 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" containerName="ceilometer-notification-agent" Dec 01 10:51:11 crc kubenswrapper[4909]: E1201 10:51:11.597993 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" containerName="sg-core" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.597999 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" containerName="sg-core" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.598169 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" containerName="proxy-httpd" Dec 01 
10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.598184 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" containerName="ceilometer-central-agent" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.598203 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" containerName="ceilometer-notification-agent" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.598211 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" containerName="sg-core" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.599985 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.600537 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.603432 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.603615 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.603743 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.663302 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.663360 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.663401 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.663502 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-scripts\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.663528 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h79df\" (UniqueName: \"kubernetes.io/projected/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-kube-api-access-h79df\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.663549 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-run-httpd\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.663580 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-log-httpd\") pod \"ceilometer-0\" 
(UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.663615 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-config-data\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.765992 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-run-httpd\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.766041 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h79df\" (UniqueName: \"kubernetes.io/projected/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-kube-api-access-h79df\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.766072 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-log-httpd\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.766100 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-config-data\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.766159 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.766180 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.766202 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.766278 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-scripts\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.769736 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-run-httpd\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.774258 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-log-httpd\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc 
kubenswrapper[4909]: I1201 10:51:11.776278 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-config-data\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.776928 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.789903 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.790238 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.790552 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-scripts\") pod \"ceilometer-0\" (UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.795001 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h79df\" (UniqueName: \"kubernetes.io/projected/98e4c943-29ab-4bb6-ab5d-a63b167e6e2f-kube-api-access-h79df\") pod \"ceilometer-0\" 
(UID: \"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f\") " pod="openstack/ceilometer-0" Dec 01 10:51:11 crc kubenswrapper[4909]: I1201 10:51:11.940893 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:51:12 crc kubenswrapper[4909]: W1201 10:51:12.318549 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98e4c943_29ab_4bb6_ab5d_a63b167e6e2f.slice/crio-bf95d029e14bd21badeac1236257c54f82874ede0b27e683cca7ec5ad36cf76b WatchSource:0}: Error finding container bf95d029e14bd21badeac1236257c54f82874ede0b27e683cca7ec5ad36cf76b: Status 404 returned error can't find the container with id bf95d029e14bd21badeac1236257c54f82874ede0b27e683cca7ec5ad36cf76b Dec 01 10:51:12 crc kubenswrapper[4909]: I1201 10:51:12.323124 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.176027 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.203711 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0484e4ef-5031-4901-bc42-976828d7ee3b-combined-ca-bundle\") pod \"0484e4ef-5031-4901-bc42-976828d7ee3b\" (UID: \"0484e4ef-5031-4901-bc42-976828d7ee3b\") " Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.203786 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2qh9\" (UniqueName: \"kubernetes.io/projected/0484e4ef-5031-4901-bc42-976828d7ee3b-kube-api-access-p2qh9\") pod \"0484e4ef-5031-4901-bc42-976828d7ee3b\" (UID: \"0484e4ef-5031-4901-bc42-976828d7ee3b\") " Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.203841 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0484e4ef-5031-4901-bc42-976828d7ee3b-logs\") pod \"0484e4ef-5031-4901-bc42-976828d7ee3b\" (UID: \"0484e4ef-5031-4901-bc42-976828d7ee3b\") " Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.204007 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0484e4ef-5031-4901-bc42-976828d7ee3b-config-data\") pod \"0484e4ef-5031-4901-bc42-976828d7ee3b\" (UID: \"0484e4ef-5031-4901-bc42-976828d7ee3b\") " Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.206603 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0484e4ef-5031-4901-bc42-976828d7ee3b-logs" (OuterVolumeSpecName: "logs") pod "0484e4ef-5031-4901-bc42-976828d7ee3b" (UID: "0484e4ef-5031-4901-bc42-976828d7ee3b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.228007 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0484e4ef-5031-4901-bc42-976828d7ee3b-kube-api-access-p2qh9" (OuterVolumeSpecName: "kube-api-access-p2qh9") pod "0484e4ef-5031-4901-bc42-976828d7ee3b" (UID: "0484e4ef-5031-4901-bc42-976828d7ee3b"). InnerVolumeSpecName "kube-api-access-p2qh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.241140 4909 generic.go:334] "Generic (PLEG): container finished" podID="0484e4ef-5031-4901-bc42-976828d7ee3b" containerID="ad4992957cdd0f34caac4cca59be4d02c35e93a3356b3e0b2a164884c87221cd" exitCode=0 Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.241237 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0484e4ef-5031-4901-bc42-976828d7ee3b","Type":"ContainerDied","Data":"ad4992957cdd0f34caac4cca59be4d02c35e93a3356b3e0b2a164884c87221cd"} Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.241266 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0484e4ef-5031-4901-bc42-976828d7ee3b","Type":"ContainerDied","Data":"a464fe1ae5d8b3b3645e9123af109fd46150e8f3484a793073d35d34fb6ab80b"} Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.241283 4909 scope.go:117] "RemoveContainer" containerID="ad4992957cdd0f34caac4cca59be4d02c35e93a3356b3e0b2a164884c87221cd" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.241462 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.245048 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f","Type":"ContainerStarted","Data":"bf95d029e14bd21badeac1236257c54f82874ede0b27e683cca7ec5ad36cf76b"} Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.273083 4909 scope.go:117] "RemoveContainer" containerID="55ddc2932143ac83f01d5dcb0a7eb95bbd3f3520721d67fcf8ae7a24940cb80f" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.290604 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee" path="/var/lib/kubelet/pods/3d6fd8c0-7164-43fe-b1c6-33cd35ac8dee/volumes" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.318342 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0484e4ef-5031-4901-bc42-976828d7ee3b-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.319268 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2qh9\" (UniqueName: \"kubernetes.io/projected/0484e4ef-5031-4901-bc42-976828d7ee3b-kube-api-access-p2qh9\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.326867 4909 scope.go:117] "RemoveContainer" containerID="ad4992957cdd0f34caac4cca59be4d02c35e93a3356b3e0b2a164884c87221cd" Dec 01 10:51:13 crc kubenswrapper[4909]: E1201 10:51:13.327524 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad4992957cdd0f34caac4cca59be4d02c35e93a3356b3e0b2a164884c87221cd\": container with ID starting with ad4992957cdd0f34caac4cca59be4d02c35e93a3356b3e0b2a164884c87221cd not found: ID does not exist" containerID="ad4992957cdd0f34caac4cca59be4d02c35e93a3356b3e0b2a164884c87221cd" Dec 01 10:51:13 crc kubenswrapper[4909]: 
I1201 10:51:13.327553 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad4992957cdd0f34caac4cca59be4d02c35e93a3356b3e0b2a164884c87221cd"} err="failed to get container status \"ad4992957cdd0f34caac4cca59be4d02c35e93a3356b3e0b2a164884c87221cd\": rpc error: code = NotFound desc = could not find container \"ad4992957cdd0f34caac4cca59be4d02c35e93a3356b3e0b2a164884c87221cd\": container with ID starting with ad4992957cdd0f34caac4cca59be4d02c35e93a3356b3e0b2a164884c87221cd not found: ID does not exist" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.327580 4909 scope.go:117] "RemoveContainer" containerID="55ddc2932143ac83f01d5dcb0a7eb95bbd3f3520721d67fcf8ae7a24940cb80f" Dec 01 10:51:13 crc kubenswrapper[4909]: E1201 10:51:13.327816 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55ddc2932143ac83f01d5dcb0a7eb95bbd3f3520721d67fcf8ae7a24940cb80f\": container with ID starting with 55ddc2932143ac83f01d5dcb0a7eb95bbd3f3520721d67fcf8ae7a24940cb80f not found: ID does not exist" containerID="55ddc2932143ac83f01d5dcb0a7eb95bbd3f3520721d67fcf8ae7a24940cb80f" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.327834 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55ddc2932143ac83f01d5dcb0a7eb95bbd3f3520721d67fcf8ae7a24940cb80f"} err="failed to get container status \"55ddc2932143ac83f01d5dcb0a7eb95bbd3f3520721d67fcf8ae7a24940cb80f\": rpc error: code = NotFound desc = could not find container \"55ddc2932143ac83f01d5dcb0a7eb95bbd3f3520721d67fcf8ae7a24940cb80f\": container with ID starting with 55ddc2932143ac83f01d5dcb0a7eb95bbd3f3520721d67fcf8ae7a24940cb80f not found: ID does not exist" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.338161 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/0484e4ef-5031-4901-bc42-976828d7ee3b-config-data" (OuterVolumeSpecName: "config-data") pod "0484e4ef-5031-4901-bc42-976828d7ee3b" (UID: "0484e4ef-5031-4901-bc42-976828d7ee3b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.342832 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0484e4ef-5031-4901-bc42-976828d7ee3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0484e4ef-5031-4901-bc42-976828d7ee3b" (UID: "0484e4ef-5031-4901-bc42-976828d7ee3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.421821 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0484e4ef-5031-4901-bc42-976828d7ee3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.421856 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0484e4ef-5031-4901-bc42-976828d7ee3b-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.582427 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.593588 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.612934 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 10:51:13 crc kubenswrapper[4909]: E1201 10:51:13.613497 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0484e4ef-5031-4901-bc42-976828d7ee3b" containerName="nova-api-api" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.613520 4909 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="0484e4ef-5031-4901-bc42-976828d7ee3b" containerName="nova-api-api" Dec 01 10:51:13 crc kubenswrapper[4909]: E1201 10:51:13.613551 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0484e4ef-5031-4901-bc42-976828d7ee3b" containerName="nova-api-log" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.613562 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0484e4ef-5031-4901-bc42-976828d7ee3b" containerName="nova-api-log" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.613807 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0484e4ef-5031-4901-bc42-976828d7ee3b" containerName="nova-api-api" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.613846 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0484e4ef-5031-4901-bc42-976828d7ee3b" containerName="nova-api-log" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.615231 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.618084 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.618143 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.618592 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.626458 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.626592 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-config-data\") pod \"nova-api-0\" (UID: 
\"7e80240d-74a5-47b9-99f8-aee705908f36\") " pod="openstack/nova-api-0" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.626636 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8mvk\" (UniqueName: \"kubernetes.io/projected/7e80240d-74a5-47b9-99f8-aee705908f36-kube-api-access-h8mvk\") pod \"nova-api-0\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " pod="openstack/nova-api-0" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.626668 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e80240d-74a5-47b9-99f8-aee705908f36-logs\") pod \"nova-api-0\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " pod="openstack/nova-api-0" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.626700 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " pod="openstack/nova-api-0" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.626795 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " pod="openstack/nova-api-0" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.627061 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-public-tls-certs\") pod \"nova-api-0\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " pod="openstack/nova-api-0" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.729511 
4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-public-tls-certs\") pod \"nova-api-0\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " pod="openstack/nova-api-0" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.729607 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-config-data\") pod \"nova-api-0\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " pod="openstack/nova-api-0" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.729630 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8mvk\" (UniqueName: \"kubernetes.io/projected/7e80240d-74a5-47b9-99f8-aee705908f36-kube-api-access-h8mvk\") pod \"nova-api-0\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " pod="openstack/nova-api-0" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.729670 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e80240d-74a5-47b9-99f8-aee705908f36-logs\") pod \"nova-api-0\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " pod="openstack/nova-api-0" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.729708 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " pod="openstack/nova-api-0" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.729914 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"7e80240d-74a5-47b9-99f8-aee705908f36\") " pod="openstack/nova-api-0" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.730383 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e80240d-74a5-47b9-99f8-aee705908f36-logs\") pod \"nova-api-0\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " pod="openstack/nova-api-0" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.734844 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " pod="openstack/nova-api-0" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.735290 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " pod="openstack/nova-api-0" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.736532 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-config-data\") pod \"nova-api-0\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " pod="openstack/nova-api-0" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.750806 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-public-tls-certs\") pod \"nova-api-0\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " pod="openstack/nova-api-0" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.754384 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8mvk\" (UniqueName: 
\"kubernetes.io/projected/7e80240d-74a5-47b9-99f8-aee705908f36-kube-api-access-h8mvk\") pod \"nova-api-0\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " pod="openstack/nova-api-0" Dec 01 10:51:13 crc kubenswrapper[4909]: I1201 10:51:13.941523 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:51:14 crc kubenswrapper[4909]: I1201 10:51:14.262487 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f","Type":"ContainerStarted","Data":"ff2170d60a005bff5562c7d890972d4dac5df457784b5b45cd1d1007cdcd6fc0"} Dec 01 10:51:14 crc kubenswrapper[4909]: W1201 10:51:14.464299 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e80240d_74a5_47b9_99f8_aee705908f36.slice/crio-43c0fae89f4a10503119833cd7d4d285113f3db25c0fdae6df7fdc49efed06d6 WatchSource:0}: Error finding container 43c0fae89f4a10503119833cd7d4d285113f3db25c0fdae6df7fdc49efed06d6: Status 404 returned error can't find the container with id 43c0fae89f4a10503119833cd7d4d285113f3db25c0fdae6df7fdc49efed06d6 Dec 01 10:51:14 crc kubenswrapper[4909]: I1201 10:51:14.464614 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:51:15 crc kubenswrapper[4909]: I1201 10:51:15.288332 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0484e4ef-5031-4901-bc42-976828d7ee3b" path="/var/lib/kubelet/pods/0484e4ef-5031-4901-bc42-976828d7ee3b/volumes" Dec 01 10:51:15 crc kubenswrapper[4909]: I1201 10:51:15.295223 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7e80240d-74a5-47b9-99f8-aee705908f36","Type":"ContainerStarted","Data":"c2cb4f81d7ba1bebbc8d07f7b69455334020eaa37d1193fbb1dfd01027d04a32"} Dec 01 10:51:15 crc kubenswrapper[4909]: I1201 10:51:15.295295 4909 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-api-0" event={"ID":"7e80240d-74a5-47b9-99f8-aee705908f36","Type":"ContainerStarted","Data":"f5e56639b7bd4e29800bd8356961b7af21ffa51b7d2141e36b6f91644db32318"} Dec 01 10:51:15 crc kubenswrapper[4909]: I1201 10:51:15.295310 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7e80240d-74a5-47b9-99f8-aee705908f36","Type":"ContainerStarted","Data":"43c0fae89f4a10503119833cd7d4d285113f3db25c0fdae6df7fdc49efed06d6"} Dec 01 10:51:15 crc kubenswrapper[4909]: I1201 10:51:15.299346 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f","Type":"ContainerStarted","Data":"16453b4fc93ea5bfaa756715e1d47515ba9eef7e851e9560ce82f281b074b7ec"} Dec 01 10:51:15 crc kubenswrapper[4909]: I1201 10:51:15.299415 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f","Type":"ContainerStarted","Data":"70010063ccc10c840b5d7d0750482a9ccd62006aaa25ae1ef27a06f082a204ef"} Dec 01 10:51:15 crc kubenswrapper[4909]: I1201 10:51:15.323111 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.3230936 podStartE2EDuration="2.3230936s" podCreationTimestamp="2025-12-01 10:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:51:15.320283029 +0000 UTC m=+1192.554753927" watchObservedRunningTime="2025-12-01 10:51:15.3230936 +0000 UTC m=+1192.557564498" Dec 01 10:51:15 crc kubenswrapper[4909]: I1201 10:51:15.459990 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:15 crc kubenswrapper[4909]: I1201 10:51:15.478001 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:16 
crc kubenswrapper[4909]: I1201 10:51:16.329249 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:51:16 crc kubenswrapper[4909]: I1201 10:51:16.491088 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-q8n7t"] Dec 01 10:51:16 crc kubenswrapper[4909]: I1201 10:51:16.492872 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q8n7t" Dec 01 10:51:16 crc kubenswrapper[4909]: I1201 10:51:16.497465 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 01 10:51:16 crc kubenswrapper[4909]: I1201 10:51:16.499248 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 01 10:51:16 crc kubenswrapper[4909]: I1201 10:51:16.506044 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-q8n7t"] Dec 01 10:51:16 crc kubenswrapper[4909]: I1201 10:51:16.688595 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s948v\" (UniqueName: \"kubernetes.io/projected/99316486-3c01-47ab-8924-cee49da4e1b4-kube-api-access-s948v\") pod \"nova-cell1-cell-mapping-q8n7t\" (UID: \"99316486-3c01-47ab-8924-cee49da4e1b4\") " pod="openstack/nova-cell1-cell-mapping-q8n7t" Dec 01 10:51:16 crc kubenswrapper[4909]: I1201 10:51:16.688652 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99316486-3c01-47ab-8924-cee49da4e1b4-config-data\") pod \"nova-cell1-cell-mapping-q8n7t\" (UID: \"99316486-3c01-47ab-8924-cee49da4e1b4\") " pod="openstack/nova-cell1-cell-mapping-q8n7t" Dec 01 10:51:16 crc kubenswrapper[4909]: I1201 10:51:16.689183 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99316486-3c01-47ab-8924-cee49da4e1b4-scripts\") pod \"nova-cell1-cell-mapping-q8n7t\" (UID: \"99316486-3c01-47ab-8924-cee49da4e1b4\") " pod="openstack/nova-cell1-cell-mapping-q8n7t" Dec 01 10:51:16 crc kubenswrapper[4909]: I1201 10:51:16.690132 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99316486-3c01-47ab-8924-cee49da4e1b4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q8n7t\" (UID: \"99316486-3c01-47ab-8924-cee49da4e1b4\") " pod="openstack/nova-cell1-cell-mapping-q8n7t" Dec 01 10:51:16 crc kubenswrapper[4909]: I1201 10:51:16.765947 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b856c5697-94dvb" Dec 01 10:51:16 crc kubenswrapper[4909]: I1201 10:51:16.791569 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s948v\" (UniqueName: \"kubernetes.io/projected/99316486-3c01-47ab-8924-cee49da4e1b4-kube-api-access-s948v\") pod \"nova-cell1-cell-mapping-q8n7t\" (UID: \"99316486-3c01-47ab-8924-cee49da4e1b4\") " pod="openstack/nova-cell1-cell-mapping-q8n7t" Dec 01 10:51:16 crc kubenswrapper[4909]: I1201 10:51:16.791982 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99316486-3c01-47ab-8924-cee49da4e1b4-config-data\") pod \"nova-cell1-cell-mapping-q8n7t\" (UID: \"99316486-3c01-47ab-8924-cee49da4e1b4\") " pod="openstack/nova-cell1-cell-mapping-q8n7t" Dec 01 10:51:16 crc kubenswrapper[4909]: I1201 10:51:16.792265 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99316486-3c01-47ab-8924-cee49da4e1b4-scripts\") pod \"nova-cell1-cell-mapping-q8n7t\" (UID: \"99316486-3c01-47ab-8924-cee49da4e1b4\") " pod="openstack/nova-cell1-cell-mapping-q8n7t" Dec 
01 10:51:16 crc kubenswrapper[4909]: I1201 10:51:16.792530 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99316486-3c01-47ab-8924-cee49da4e1b4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q8n7t\" (UID: \"99316486-3c01-47ab-8924-cee49da4e1b4\") " pod="openstack/nova-cell1-cell-mapping-q8n7t" Dec 01 10:51:16 crc kubenswrapper[4909]: I1201 10:51:16.801164 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99316486-3c01-47ab-8924-cee49da4e1b4-scripts\") pod \"nova-cell1-cell-mapping-q8n7t\" (UID: \"99316486-3c01-47ab-8924-cee49da4e1b4\") " pod="openstack/nova-cell1-cell-mapping-q8n7t" Dec 01 10:51:16 crc kubenswrapper[4909]: I1201 10:51:16.801465 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99316486-3c01-47ab-8924-cee49da4e1b4-config-data\") pod \"nova-cell1-cell-mapping-q8n7t\" (UID: \"99316486-3c01-47ab-8924-cee49da4e1b4\") " pod="openstack/nova-cell1-cell-mapping-q8n7t" Dec 01 10:51:16 crc kubenswrapper[4909]: I1201 10:51:16.806888 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99316486-3c01-47ab-8924-cee49da4e1b4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q8n7t\" (UID: \"99316486-3c01-47ab-8924-cee49da4e1b4\") " pod="openstack/nova-cell1-cell-mapping-q8n7t" Dec 01 10:51:16 crc kubenswrapper[4909]: I1201 10:51:16.833959 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s948v\" (UniqueName: \"kubernetes.io/projected/99316486-3c01-47ab-8924-cee49da4e1b4-kube-api-access-s948v\") pod \"nova-cell1-cell-mapping-q8n7t\" (UID: \"99316486-3c01-47ab-8924-cee49da4e1b4\") " pod="openstack/nova-cell1-cell-mapping-q8n7t" Dec 01 10:51:16 crc kubenswrapper[4909]: I1201 10:51:16.855449 4909 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-qzhsz"] Dec 01 10:51:16 crc kubenswrapper[4909]: I1201 10:51:16.855681 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" podUID="2e8e06da-87e5-467f-8938-0a349513a0c8" containerName="dnsmasq-dns" containerID="cri-o://a26ae3f80b5c864852d35d53a6296497e0fc4ffc4f14c21a11ce96126294dd1d" gracePeriod=10 Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.121534 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q8n7t" Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.330666 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"98e4c943-29ab-4bb6-ab5d-a63b167e6e2f","Type":"ContainerStarted","Data":"e59a16eb455a53924317958ec2c151da256d6636875dd01a9f40779bbfdfe1ed"} Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.331329 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.338660 4909 generic.go:334] "Generic (PLEG): container finished" podID="2e8e06da-87e5-467f-8938-0a349513a0c8" containerID="a26ae3f80b5c864852d35d53a6296497e0fc4ffc4f14c21a11ce96126294dd1d" exitCode=0 Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.338730 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" event={"ID":"2e8e06da-87e5-467f-8938-0a349513a0c8","Type":"ContainerDied","Data":"a26ae3f80b5c864852d35d53a6296497e0fc4ffc4f14c21a11ce96126294dd1d"} Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.355385 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.365397 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.732727874 podStartE2EDuration="6.365371527s" podCreationTimestamp="2025-12-01 10:51:11 +0000 UTC" firstStartedPulling="2025-12-01 10:51:12.322252604 +0000 UTC m=+1189.556723502" lastFinishedPulling="2025-12-01 10:51:16.954896257 +0000 UTC m=+1194.189367155" observedRunningTime="2025-12-01 10:51:17.364617924 +0000 UTC m=+1194.599088822" watchObservedRunningTime="2025-12-01 10:51:17.365371527 +0000 UTC m=+1194.599842425" Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.411173 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-dns-svc\") pod \"2e8e06da-87e5-467f-8938-0a349513a0c8\" (UID: \"2e8e06da-87e5-467f-8938-0a349513a0c8\") " Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.411248 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-ovsdbserver-nb\") pod \"2e8e06da-87e5-467f-8938-0a349513a0c8\" (UID: \"2e8e06da-87e5-467f-8938-0a349513a0c8\") " Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.411316 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmprk\" (UniqueName: \"kubernetes.io/projected/2e8e06da-87e5-467f-8938-0a349513a0c8-kube-api-access-fmprk\") pod \"2e8e06da-87e5-467f-8938-0a349513a0c8\" (UID: \"2e8e06da-87e5-467f-8938-0a349513a0c8\") " Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.411354 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-ovsdbserver-sb\") pod \"2e8e06da-87e5-467f-8938-0a349513a0c8\" (UID: \"2e8e06da-87e5-467f-8938-0a349513a0c8\") " Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.413568 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-config\") pod \"2e8e06da-87e5-467f-8938-0a349513a0c8\" (UID: \"2e8e06da-87e5-467f-8938-0a349513a0c8\") " Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.420972 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e8e06da-87e5-467f-8938-0a349513a0c8-kube-api-access-fmprk" (OuterVolumeSpecName: "kube-api-access-fmprk") pod "2e8e06da-87e5-467f-8938-0a349513a0c8" (UID: "2e8e06da-87e5-467f-8938-0a349513a0c8"). InnerVolumeSpecName "kube-api-access-fmprk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.471763 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2e8e06da-87e5-467f-8938-0a349513a0c8" (UID: "2e8e06da-87e5-467f-8938-0a349513a0c8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.474162 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2e8e06da-87e5-467f-8938-0a349513a0c8" (UID: "2e8e06da-87e5-467f-8938-0a349513a0c8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.502700 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2e8e06da-87e5-467f-8938-0a349513a0c8" (UID: "2e8e06da-87e5-467f-8938-0a349513a0c8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.509335 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-config" (OuterVolumeSpecName: "config") pod "2e8e06da-87e5-467f-8938-0a349513a0c8" (UID: "2e8e06da-87e5-467f-8938-0a349513a0c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.518782 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.518831 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.518847 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmprk\" (UniqueName: \"kubernetes.io/projected/2e8e06da-87e5-467f-8938-0a349513a0c8-kube-api-access-fmprk\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.518858 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 
10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.518868 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e8e06da-87e5-467f-8938-0a349513a0c8-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:17 crc kubenswrapper[4909]: I1201 10:51:17.685325 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-q8n7t"] Dec 01 10:51:18 crc kubenswrapper[4909]: I1201 10:51:18.350612 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" event={"ID":"2e8e06da-87e5-467f-8938-0a349513a0c8","Type":"ContainerDied","Data":"2850e5f1684993862511b6fcb2e986ac2f545c8c74b1f7382b22391bff450077"} Dec 01 10:51:18 crc kubenswrapper[4909]: I1201 10:51:18.350639 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-qzhsz" Dec 01 10:51:18 crc kubenswrapper[4909]: I1201 10:51:18.351138 4909 scope.go:117] "RemoveContainer" containerID="a26ae3f80b5c864852d35d53a6296497e0fc4ffc4f14c21a11ce96126294dd1d" Dec 01 10:51:18 crc kubenswrapper[4909]: I1201 10:51:18.365545 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q8n7t" event={"ID":"99316486-3c01-47ab-8924-cee49da4e1b4","Type":"ContainerStarted","Data":"bd9dd739f1153b614240161421f81051fcb6fb70d83f3b38f8553f249952f701"} Dec 01 10:51:18 crc kubenswrapper[4909]: I1201 10:51:18.365605 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q8n7t" event={"ID":"99316486-3c01-47ab-8924-cee49da4e1b4","Type":"ContainerStarted","Data":"5af6b501ee1b6adffdd037bccb5b587df3d33f7505da0153b8d9cd3edd6c0efa"} Dec 01 10:51:18 crc kubenswrapper[4909]: I1201 10:51:18.399844 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-q8n7t" podStartSLOduration=2.399818795 podStartE2EDuration="2.399818795s" 
podCreationTimestamp="2025-12-01 10:51:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:51:18.396624512 +0000 UTC m=+1195.631095430" watchObservedRunningTime="2025-12-01 10:51:18.399818795 +0000 UTC m=+1195.634289693" Dec 01 10:51:18 crc kubenswrapper[4909]: I1201 10:51:18.404435 4909 scope.go:117] "RemoveContainer" containerID="71f0ebd1e2f44c31728aa1f855063237d4dd258bf5ca23184ca4f5ebfc7b3532" Dec 01 10:51:18 crc kubenswrapper[4909]: I1201 10:51:18.430134 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-qzhsz"] Dec 01 10:51:18 crc kubenswrapper[4909]: I1201 10:51:18.440566 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-qzhsz"] Dec 01 10:51:19 crc kubenswrapper[4909]: I1201 10:51:19.273615 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e8e06da-87e5-467f-8938-0a349513a0c8" path="/var/lib/kubelet/pods/2e8e06da-87e5-467f-8938-0a349513a0c8/volumes" Dec 01 10:51:23 crc kubenswrapper[4909]: I1201 10:51:23.422202 4909 generic.go:334] "Generic (PLEG): container finished" podID="99316486-3c01-47ab-8924-cee49da4e1b4" containerID="bd9dd739f1153b614240161421f81051fcb6fb70d83f3b38f8553f249952f701" exitCode=0 Dec 01 10:51:23 crc kubenswrapper[4909]: I1201 10:51:23.422284 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q8n7t" event={"ID":"99316486-3c01-47ab-8924-cee49da4e1b4","Type":"ContainerDied","Data":"bd9dd739f1153b614240161421f81051fcb6fb70d83f3b38f8553f249952f701"} Dec 01 10:51:23 crc kubenswrapper[4909]: I1201 10:51:23.942573 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 10:51:23 crc kubenswrapper[4909]: I1201 10:51:23.943082 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 10:51:24 crc 
kubenswrapper[4909]: I1201 10:51:24.788917 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q8n7t" Dec 01 10:51:24 crc kubenswrapper[4909]: I1201 10:51:24.960174 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7e80240d-74a5-47b9-99f8-aee705908f36" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.182:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 10:51:24 crc kubenswrapper[4909]: I1201 10:51:24.960191 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7e80240d-74a5-47b9-99f8-aee705908f36" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.182:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 10:51:24 crc kubenswrapper[4909]: I1201 10:51:24.974401 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99316486-3c01-47ab-8924-cee49da4e1b4-combined-ca-bundle\") pod \"99316486-3c01-47ab-8924-cee49da4e1b4\" (UID: \"99316486-3c01-47ab-8924-cee49da4e1b4\") " Dec 01 10:51:24 crc kubenswrapper[4909]: I1201 10:51:24.974688 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s948v\" (UniqueName: \"kubernetes.io/projected/99316486-3c01-47ab-8924-cee49da4e1b4-kube-api-access-s948v\") pod \"99316486-3c01-47ab-8924-cee49da4e1b4\" (UID: \"99316486-3c01-47ab-8924-cee49da4e1b4\") " Dec 01 10:51:24 crc kubenswrapper[4909]: I1201 10:51:24.974772 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99316486-3c01-47ab-8924-cee49da4e1b4-scripts\") pod \"99316486-3c01-47ab-8924-cee49da4e1b4\" (UID: \"99316486-3c01-47ab-8924-cee49da4e1b4\") " Dec 01 10:51:24 crc 
kubenswrapper[4909]: I1201 10:51:24.975010 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99316486-3c01-47ab-8924-cee49da4e1b4-config-data\") pod \"99316486-3c01-47ab-8924-cee49da4e1b4\" (UID: \"99316486-3c01-47ab-8924-cee49da4e1b4\") " Dec 01 10:51:24 crc kubenswrapper[4909]: I1201 10:51:24.985471 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99316486-3c01-47ab-8924-cee49da4e1b4-scripts" (OuterVolumeSpecName: "scripts") pod "99316486-3c01-47ab-8924-cee49da4e1b4" (UID: "99316486-3c01-47ab-8924-cee49da4e1b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:24 crc kubenswrapper[4909]: I1201 10:51:24.985554 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99316486-3c01-47ab-8924-cee49da4e1b4-kube-api-access-s948v" (OuterVolumeSpecName: "kube-api-access-s948v") pod "99316486-3c01-47ab-8924-cee49da4e1b4" (UID: "99316486-3c01-47ab-8924-cee49da4e1b4"). InnerVolumeSpecName "kube-api-access-s948v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:51:25 crc kubenswrapper[4909]: I1201 10:51:25.008090 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99316486-3c01-47ab-8924-cee49da4e1b4-config-data" (OuterVolumeSpecName: "config-data") pod "99316486-3c01-47ab-8924-cee49da4e1b4" (UID: "99316486-3c01-47ab-8924-cee49da4e1b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:25 crc kubenswrapper[4909]: I1201 10:51:25.014120 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99316486-3c01-47ab-8924-cee49da4e1b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99316486-3c01-47ab-8924-cee49da4e1b4" (UID: "99316486-3c01-47ab-8924-cee49da4e1b4"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:25 crc kubenswrapper[4909]: I1201 10:51:25.080437 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99316486-3c01-47ab-8924-cee49da4e1b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:25 crc kubenswrapper[4909]: I1201 10:51:25.080494 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s948v\" (UniqueName: \"kubernetes.io/projected/99316486-3c01-47ab-8924-cee49da4e1b4-kube-api-access-s948v\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:25 crc kubenswrapper[4909]: I1201 10:51:25.080511 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99316486-3c01-47ab-8924-cee49da4e1b4-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:25 crc kubenswrapper[4909]: I1201 10:51:25.080524 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99316486-3c01-47ab-8924-cee49da4e1b4-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:25 crc kubenswrapper[4909]: I1201 10:51:25.447718 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q8n7t" event={"ID":"99316486-3c01-47ab-8924-cee49da4e1b4","Type":"ContainerDied","Data":"5af6b501ee1b6adffdd037bccb5b587df3d33f7505da0153b8d9cd3edd6c0efa"} Dec 01 10:51:25 crc kubenswrapper[4909]: I1201 10:51:25.447778 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5af6b501ee1b6adffdd037bccb5b587df3d33f7505da0153b8d9cd3edd6c0efa" Dec 01 10:51:25 crc kubenswrapper[4909]: I1201 10:51:25.447845 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q8n7t" Dec 01 10:51:25 crc kubenswrapper[4909]: I1201 10:51:25.627424 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:51:25 crc kubenswrapper[4909]: I1201 10:51:25.627915 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7e80240d-74a5-47b9-99f8-aee705908f36" containerName="nova-api-api" containerID="cri-o://c2cb4f81d7ba1bebbc8d07f7b69455334020eaa37d1193fbb1dfd01027d04a32" gracePeriod=30 Dec 01 10:51:25 crc kubenswrapper[4909]: I1201 10:51:25.628176 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7e80240d-74a5-47b9-99f8-aee705908f36" containerName="nova-api-log" containerID="cri-o://f5e56639b7bd4e29800bd8356961b7af21ffa51b7d2141e36b6f91644db32318" gracePeriod=30 Dec 01 10:51:25 crc kubenswrapper[4909]: I1201 10:51:25.648412 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:51:25 crc kubenswrapper[4909]: I1201 10:51:25.648688 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b421bbb3-25fb-4eb5-ac53-a03f2f941b6a" containerName="nova-scheduler-scheduler" containerID="cri-o://f6e3d23adda41159f95ab80623ddd7858758ac4019ff386005a408f5414d7960" gracePeriod=30 Dec 01 10:51:25 crc kubenswrapper[4909]: I1201 10:51:25.668191 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:51:25 crc kubenswrapper[4909]: I1201 10:51:25.668440 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="16639ce4-cbc0-437d-b43c-63f9800bb171" containerName="nova-metadata-log" containerID="cri-o://78337096400aa6a0bd1e985637ac0ed92032f9bacbb592090f0bb1a4de893fd9" gracePeriod=30 Dec 01 10:51:25 crc kubenswrapper[4909]: I1201 10:51:25.668570 4909 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="16639ce4-cbc0-437d-b43c-63f9800bb171" containerName="nova-metadata-metadata" containerID="cri-o://aea4b0ba4bb88bf12308b28e3fea7a9e97026acf8921b55c9eea95799cd80f43" gracePeriod=30 Dec 01 10:51:26 crc kubenswrapper[4909]: I1201 10:51:26.462956 4909 generic.go:334] "Generic (PLEG): container finished" podID="16639ce4-cbc0-437d-b43c-63f9800bb171" containerID="78337096400aa6a0bd1e985637ac0ed92032f9bacbb592090f0bb1a4de893fd9" exitCode=143 Dec 01 10:51:26 crc kubenswrapper[4909]: I1201 10:51:26.463039 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16639ce4-cbc0-437d-b43c-63f9800bb171","Type":"ContainerDied","Data":"78337096400aa6a0bd1e985637ac0ed92032f9bacbb592090f0bb1a4de893fd9"} Dec 01 10:51:26 crc kubenswrapper[4909]: I1201 10:51:26.466982 4909 generic.go:334] "Generic (PLEG): container finished" podID="7e80240d-74a5-47b9-99f8-aee705908f36" containerID="f5e56639b7bd4e29800bd8356961b7af21ffa51b7d2141e36b6f91644db32318" exitCode=143 Dec 01 10:51:26 crc kubenswrapper[4909]: I1201 10:51:26.467050 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7e80240d-74a5-47b9-99f8-aee705908f36","Type":"ContainerDied","Data":"f5e56639b7bd4e29800bd8356961b7af21ffa51b7d2141e36b6f91644db32318"} Dec 01 10:51:27 crc kubenswrapper[4909]: E1201 10:51:27.299032 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f6e3d23adda41159f95ab80623ddd7858758ac4019ff386005a408f5414d7960" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 10:51:27 crc kubenswrapper[4909]: E1201 10:51:27.300955 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="f6e3d23adda41159f95ab80623ddd7858758ac4019ff386005a408f5414d7960" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 10:51:27 crc kubenswrapper[4909]: E1201 10:51:27.302558 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f6e3d23adda41159f95ab80623ddd7858758ac4019ff386005a408f5414d7960" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 10:51:27 crc kubenswrapper[4909]: E1201 10:51:27.302613 4909 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b421bbb3-25fb-4eb5-ac53-a03f2f941b6a" containerName="nova-scheduler-scheduler" Dec 01 10:51:28 crc kubenswrapper[4909]: I1201 10:51:28.827752 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="16639ce4-cbc0-437d-b43c-63f9800bb171" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.176:8775/\": read tcp 10.217.0.2:60482->10.217.0.176:8775: read: connection reset by peer" Dec 01 10:51:28 crc kubenswrapper[4909]: I1201 10:51:28.827926 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="16639ce4-cbc0-437d-b43c-63f9800bb171" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.176:8775/\": read tcp 10.217.0.2:60484->10.217.0.176:8775: read: connection reset by peer" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.312929 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.477820 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/16639ce4-cbc0-437d-b43c-63f9800bb171-nova-metadata-tls-certs\") pod \"16639ce4-cbc0-437d-b43c-63f9800bb171\" (UID: \"16639ce4-cbc0-437d-b43c-63f9800bb171\") " Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.477946 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16639ce4-cbc0-437d-b43c-63f9800bb171-combined-ca-bundle\") pod \"16639ce4-cbc0-437d-b43c-63f9800bb171\" (UID: \"16639ce4-cbc0-437d-b43c-63f9800bb171\") " Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.478041 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16639ce4-cbc0-437d-b43c-63f9800bb171-logs\") pod \"16639ce4-cbc0-437d-b43c-63f9800bb171\" (UID: \"16639ce4-cbc0-437d-b43c-63f9800bb171\") " Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.478178 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16639ce4-cbc0-437d-b43c-63f9800bb171-config-data\") pod \"16639ce4-cbc0-437d-b43c-63f9800bb171\" (UID: \"16639ce4-cbc0-437d-b43c-63f9800bb171\") " Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.478234 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q7fx\" (UniqueName: \"kubernetes.io/projected/16639ce4-cbc0-437d-b43c-63f9800bb171-kube-api-access-5q7fx\") pod \"16639ce4-cbc0-437d-b43c-63f9800bb171\" (UID: \"16639ce4-cbc0-437d-b43c-63f9800bb171\") " Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.479645 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/16639ce4-cbc0-437d-b43c-63f9800bb171-logs" (OuterVolumeSpecName: "logs") pod "16639ce4-cbc0-437d-b43c-63f9800bb171" (UID: "16639ce4-cbc0-437d-b43c-63f9800bb171"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.498136 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16639ce4-cbc0-437d-b43c-63f9800bb171-kube-api-access-5q7fx" (OuterVolumeSpecName: "kube-api-access-5q7fx") pod "16639ce4-cbc0-437d-b43c-63f9800bb171" (UID: "16639ce4-cbc0-437d-b43c-63f9800bb171"). InnerVolumeSpecName "kube-api-access-5q7fx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.501817 4909 generic.go:334] "Generic (PLEG): container finished" podID="16639ce4-cbc0-437d-b43c-63f9800bb171" containerID="aea4b0ba4bb88bf12308b28e3fea7a9e97026acf8921b55c9eea95799cd80f43" exitCode=0 Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.501960 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16639ce4-cbc0-437d-b43c-63f9800bb171","Type":"ContainerDied","Data":"aea4b0ba4bb88bf12308b28e3fea7a9e97026acf8921b55c9eea95799cd80f43"} Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.502026 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16639ce4-cbc0-437d-b43c-63f9800bb171","Type":"ContainerDied","Data":"92129a45fe76a976d0ab38f668df96709b14bff5d58f769f8314f0c5bd632f8a"} Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.502048 4909 scope.go:117] "RemoveContainer" containerID="aea4b0ba4bb88bf12308b28e3fea7a9e97026acf8921b55c9eea95799cd80f43" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.502288 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.512379 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16639ce4-cbc0-437d-b43c-63f9800bb171-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16639ce4-cbc0-437d-b43c-63f9800bb171" (UID: "16639ce4-cbc0-437d-b43c-63f9800bb171"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.519144 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16639ce4-cbc0-437d-b43c-63f9800bb171-config-data" (OuterVolumeSpecName: "config-data") pod "16639ce4-cbc0-437d-b43c-63f9800bb171" (UID: "16639ce4-cbc0-437d-b43c-63f9800bb171"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.553572 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16639ce4-cbc0-437d-b43c-63f9800bb171-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "16639ce4-cbc0-437d-b43c-63f9800bb171" (UID: "16639ce4-cbc0-437d-b43c-63f9800bb171"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.580738 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16639ce4-cbc0-437d-b43c-63f9800bb171-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.580774 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q7fx\" (UniqueName: \"kubernetes.io/projected/16639ce4-cbc0-437d-b43c-63f9800bb171-kube-api-access-5q7fx\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.580789 4909 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/16639ce4-cbc0-437d-b43c-63f9800bb171-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.580798 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16639ce4-cbc0-437d-b43c-63f9800bb171-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.580809 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16639ce4-cbc0-437d-b43c-63f9800bb171-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.614487 4909 scope.go:117] "RemoveContainer" containerID="78337096400aa6a0bd1e985637ac0ed92032f9bacbb592090f0bb1a4de893fd9" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.638234 4909 scope.go:117] "RemoveContainer" containerID="aea4b0ba4bb88bf12308b28e3fea7a9e97026acf8921b55c9eea95799cd80f43" Dec 01 10:51:29 crc kubenswrapper[4909]: E1201 10:51:29.638659 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"aea4b0ba4bb88bf12308b28e3fea7a9e97026acf8921b55c9eea95799cd80f43\": container with ID starting with aea4b0ba4bb88bf12308b28e3fea7a9e97026acf8921b55c9eea95799cd80f43 not found: ID does not exist" containerID="aea4b0ba4bb88bf12308b28e3fea7a9e97026acf8921b55c9eea95799cd80f43" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.638692 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea4b0ba4bb88bf12308b28e3fea7a9e97026acf8921b55c9eea95799cd80f43"} err="failed to get container status \"aea4b0ba4bb88bf12308b28e3fea7a9e97026acf8921b55c9eea95799cd80f43\": rpc error: code = NotFound desc = could not find container \"aea4b0ba4bb88bf12308b28e3fea7a9e97026acf8921b55c9eea95799cd80f43\": container with ID starting with aea4b0ba4bb88bf12308b28e3fea7a9e97026acf8921b55c9eea95799cd80f43 not found: ID does not exist" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.638731 4909 scope.go:117] "RemoveContainer" containerID="78337096400aa6a0bd1e985637ac0ed92032f9bacbb592090f0bb1a4de893fd9" Dec 01 10:51:29 crc kubenswrapper[4909]: E1201 10:51:29.638992 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78337096400aa6a0bd1e985637ac0ed92032f9bacbb592090f0bb1a4de893fd9\": container with ID starting with 78337096400aa6a0bd1e985637ac0ed92032f9bacbb592090f0bb1a4de893fd9 not found: ID does not exist" containerID="78337096400aa6a0bd1e985637ac0ed92032f9bacbb592090f0bb1a4de893fd9" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.639008 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78337096400aa6a0bd1e985637ac0ed92032f9bacbb592090f0bb1a4de893fd9"} err="failed to get container status \"78337096400aa6a0bd1e985637ac0ed92032f9bacbb592090f0bb1a4de893fd9\": rpc error: code = NotFound desc = could not find container \"78337096400aa6a0bd1e985637ac0ed92032f9bacbb592090f0bb1a4de893fd9\": container with ID 
starting with 78337096400aa6a0bd1e985637ac0ed92032f9bacbb592090f0bb1a4de893fd9 not found: ID does not exist" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.839315 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.848225 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.895494 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:51:29 crc kubenswrapper[4909]: E1201 10:51:29.896182 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e8e06da-87e5-467f-8938-0a349513a0c8" containerName="dnsmasq-dns" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.896198 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e8e06da-87e5-467f-8938-0a349513a0c8" containerName="dnsmasq-dns" Dec 01 10:51:29 crc kubenswrapper[4909]: E1201 10:51:29.896214 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16639ce4-cbc0-437d-b43c-63f9800bb171" containerName="nova-metadata-log" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.896222 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="16639ce4-cbc0-437d-b43c-63f9800bb171" containerName="nova-metadata-log" Dec 01 10:51:29 crc kubenswrapper[4909]: E1201 10:51:29.896251 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16639ce4-cbc0-437d-b43c-63f9800bb171" containerName="nova-metadata-metadata" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.896258 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="16639ce4-cbc0-437d-b43c-63f9800bb171" containerName="nova-metadata-metadata" Dec 01 10:51:29 crc kubenswrapper[4909]: E1201 10:51:29.896269 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99316486-3c01-47ab-8924-cee49da4e1b4" containerName="nova-manage" Dec 01 10:51:29 crc 
kubenswrapper[4909]: I1201 10:51:29.896275 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="99316486-3c01-47ab-8924-cee49da4e1b4" containerName="nova-manage" Dec 01 10:51:29 crc kubenswrapper[4909]: E1201 10:51:29.896300 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e8e06da-87e5-467f-8938-0a349513a0c8" containerName="init" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.896306 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e8e06da-87e5-467f-8938-0a349513a0c8" containerName="init" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.896603 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="99316486-3c01-47ab-8924-cee49da4e1b4" containerName="nova-manage" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.896627 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="16639ce4-cbc0-437d-b43c-63f9800bb171" containerName="nova-metadata-log" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.896646 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="16639ce4-cbc0-437d-b43c-63f9800bb171" containerName="nova-metadata-metadata" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.896668 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e8e06da-87e5-467f-8938-0a349513a0c8" containerName="dnsmasq-dns" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.898130 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.901708 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.910731 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.913703 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.990695 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrzv5\" (UniqueName: \"kubernetes.io/projected/da6bc7be-6a1b-42f5-ae7c-1c7a5288755e-kube-api-access-jrzv5\") pod \"nova-metadata-0\" (UID: \"da6bc7be-6a1b-42f5-ae7c-1c7a5288755e\") " pod="openstack/nova-metadata-0" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.990769 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da6bc7be-6a1b-42f5-ae7c-1c7a5288755e-logs\") pod \"nova-metadata-0\" (UID: \"da6bc7be-6a1b-42f5-ae7c-1c7a5288755e\") " pod="openstack/nova-metadata-0" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.990805 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da6bc7be-6a1b-42f5-ae7c-1c7a5288755e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"da6bc7be-6a1b-42f5-ae7c-1c7a5288755e\") " pod="openstack/nova-metadata-0" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.990840 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/da6bc7be-6a1b-42f5-ae7c-1c7a5288755e-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"da6bc7be-6a1b-42f5-ae7c-1c7a5288755e\") " pod="openstack/nova-metadata-0" Dec 01 10:51:29 crc kubenswrapper[4909]: I1201 10:51:29.991001 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da6bc7be-6a1b-42f5-ae7c-1c7a5288755e-config-data\") pod \"nova-metadata-0\" (UID: \"da6bc7be-6a1b-42f5-ae7c-1c7a5288755e\") " pod="openstack/nova-metadata-0" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.092331 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da6bc7be-6a1b-42f5-ae7c-1c7a5288755e-config-data\") pod \"nova-metadata-0\" (UID: \"da6bc7be-6a1b-42f5-ae7c-1c7a5288755e\") " pod="openstack/nova-metadata-0" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.092448 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da6bc7be-6a1b-42f5-ae7c-1c7a5288755e-logs\") pod \"nova-metadata-0\" (UID: \"da6bc7be-6a1b-42f5-ae7c-1c7a5288755e\") " pod="openstack/nova-metadata-0" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.092473 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrzv5\" (UniqueName: \"kubernetes.io/projected/da6bc7be-6a1b-42f5-ae7c-1c7a5288755e-kube-api-access-jrzv5\") pod \"nova-metadata-0\" (UID: \"da6bc7be-6a1b-42f5-ae7c-1c7a5288755e\") " pod="openstack/nova-metadata-0" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.092499 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da6bc7be-6a1b-42f5-ae7c-1c7a5288755e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"da6bc7be-6a1b-42f5-ae7c-1c7a5288755e\") " pod="openstack/nova-metadata-0" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.092524 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/da6bc7be-6a1b-42f5-ae7c-1c7a5288755e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"da6bc7be-6a1b-42f5-ae7c-1c7a5288755e\") " pod="openstack/nova-metadata-0" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.093644 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da6bc7be-6a1b-42f5-ae7c-1c7a5288755e-logs\") pod \"nova-metadata-0\" (UID: \"da6bc7be-6a1b-42f5-ae7c-1c7a5288755e\") " pod="openstack/nova-metadata-0" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.097458 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da6bc7be-6a1b-42f5-ae7c-1c7a5288755e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"da6bc7be-6a1b-42f5-ae7c-1c7a5288755e\") " pod="openstack/nova-metadata-0" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.098420 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/da6bc7be-6a1b-42f5-ae7c-1c7a5288755e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"da6bc7be-6a1b-42f5-ae7c-1c7a5288755e\") " pod="openstack/nova-metadata-0" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.098441 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da6bc7be-6a1b-42f5-ae7c-1c7a5288755e-config-data\") pod \"nova-metadata-0\" (UID: \"da6bc7be-6a1b-42f5-ae7c-1c7a5288755e\") " pod="openstack/nova-metadata-0" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.112172 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrzv5\" (UniqueName: \"kubernetes.io/projected/da6bc7be-6a1b-42f5-ae7c-1c7a5288755e-kube-api-access-jrzv5\") pod 
\"nova-metadata-0\" (UID: \"da6bc7be-6a1b-42f5-ae7c-1c7a5288755e\") " pod="openstack/nova-metadata-0" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.254496 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.453999 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.513923 4909 generic.go:334] "Generic (PLEG): container finished" podID="7e80240d-74a5-47b9-99f8-aee705908f36" containerID="c2cb4f81d7ba1bebbc8d07f7b69455334020eaa37d1193fbb1dfd01027d04a32" exitCode=0 Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.514035 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7e80240d-74a5-47b9-99f8-aee705908f36","Type":"ContainerDied","Data":"c2cb4f81d7ba1bebbc8d07f7b69455334020eaa37d1193fbb1dfd01027d04a32"} Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.514080 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7e80240d-74a5-47b9-99f8-aee705908f36","Type":"ContainerDied","Data":"43c0fae89f4a10503119833cd7d4d285113f3db25c0fdae6df7fdc49efed06d6"} Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.514103 4909 scope.go:117] "RemoveContainer" containerID="c2cb4f81d7ba1bebbc8d07f7b69455334020eaa37d1193fbb1dfd01027d04a32" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.514731 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.540373 4909 scope.go:117] "RemoveContainer" containerID="f5e56639b7bd4e29800bd8356961b7af21ffa51b7d2141e36b6f91644db32318" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.576580 4909 scope.go:117] "RemoveContainer" containerID="c2cb4f81d7ba1bebbc8d07f7b69455334020eaa37d1193fbb1dfd01027d04a32" Dec 01 10:51:30 crc kubenswrapper[4909]: E1201 10:51:30.579346 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2cb4f81d7ba1bebbc8d07f7b69455334020eaa37d1193fbb1dfd01027d04a32\": container with ID starting with c2cb4f81d7ba1bebbc8d07f7b69455334020eaa37d1193fbb1dfd01027d04a32 not found: ID does not exist" containerID="c2cb4f81d7ba1bebbc8d07f7b69455334020eaa37d1193fbb1dfd01027d04a32" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.579483 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2cb4f81d7ba1bebbc8d07f7b69455334020eaa37d1193fbb1dfd01027d04a32"} err="failed to get container status \"c2cb4f81d7ba1bebbc8d07f7b69455334020eaa37d1193fbb1dfd01027d04a32\": rpc error: code = NotFound desc = could not find container \"c2cb4f81d7ba1bebbc8d07f7b69455334020eaa37d1193fbb1dfd01027d04a32\": container with ID starting with c2cb4f81d7ba1bebbc8d07f7b69455334020eaa37d1193fbb1dfd01027d04a32 not found: ID does not exist" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.579512 4909 scope.go:117] "RemoveContainer" containerID="f5e56639b7bd4e29800bd8356961b7af21ffa51b7d2141e36b6f91644db32318" Dec 01 10:51:30 crc kubenswrapper[4909]: E1201 10:51:30.579973 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5e56639b7bd4e29800bd8356961b7af21ffa51b7d2141e36b6f91644db32318\": container with ID starting with 
f5e56639b7bd4e29800bd8356961b7af21ffa51b7d2141e36b6f91644db32318 not found: ID does not exist" containerID="f5e56639b7bd4e29800bd8356961b7af21ffa51b7d2141e36b6f91644db32318" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.580016 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5e56639b7bd4e29800bd8356961b7af21ffa51b7d2141e36b6f91644db32318"} err="failed to get container status \"f5e56639b7bd4e29800bd8356961b7af21ffa51b7d2141e36b6f91644db32318\": rpc error: code = NotFound desc = could not find container \"f5e56639b7bd4e29800bd8356961b7af21ffa51b7d2141e36b6f91644db32318\": container with ID starting with f5e56639b7bd4e29800bd8356961b7af21ffa51b7d2141e36b6f91644db32318 not found: ID does not exist" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.601792 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e80240d-74a5-47b9-99f8-aee705908f36-logs\") pod \"7e80240d-74a5-47b9-99f8-aee705908f36\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.602598 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e80240d-74a5-47b9-99f8-aee705908f36-logs" (OuterVolumeSpecName: "logs") pod "7e80240d-74a5-47b9-99f8-aee705908f36" (UID: "7e80240d-74a5-47b9-99f8-aee705908f36"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.602914 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-combined-ca-bundle\") pod \"7e80240d-74a5-47b9-99f8-aee705908f36\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.602946 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-config-data\") pod \"7e80240d-74a5-47b9-99f8-aee705908f36\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.603711 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-public-tls-certs\") pod \"7e80240d-74a5-47b9-99f8-aee705908f36\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.603749 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8mvk\" (UniqueName: \"kubernetes.io/projected/7e80240d-74a5-47b9-99f8-aee705908f36-kube-api-access-h8mvk\") pod \"7e80240d-74a5-47b9-99f8-aee705908f36\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.603859 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-internal-tls-certs\") pod \"7e80240d-74a5-47b9-99f8-aee705908f36\" (UID: \"7e80240d-74a5-47b9-99f8-aee705908f36\") " Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.605014 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7e80240d-74a5-47b9-99f8-aee705908f36-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.608751 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e80240d-74a5-47b9-99f8-aee705908f36-kube-api-access-h8mvk" (OuterVolumeSpecName: "kube-api-access-h8mvk") pod "7e80240d-74a5-47b9-99f8-aee705908f36" (UID: "7e80240d-74a5-47b9-99f8-aee705908f36"). InnerVolumeSpecName "kube-api-access-h8mvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.634120 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e80240d-74a5-47b9-99f8-aee705908f36" (UID: "7e80240d-74a5-47b9-99f8-aee705908f36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.635992 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-config-data" (OuterVolumeSpecName: "config-data") pod "7e80240d-74a5-47b9-99f8-aee705908f36" (UID: "7e80240d-74a5-47b9-99f8-aee705908f36"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.655994 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7e80240d-74a5-47b9-99f8-aee705908f36" (UID: "7e80240d-74a5-47b9-99f8-aee705908f36"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.657562 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7e80240d-74a5-47b9-99f8-aee705908f36" (UID: "7e80240d-74a5-47b9-99f8-aee705908f36"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.707406 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.707494 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.707507 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8mvk\" (UniqueName: \"kubernetes.io/projected/7e80240d-74a5-47b9-99f8-aee705908f36-kube-api-access-h8mvk\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.707519 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.707535 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e80240d-74a5-47b9-99f8-aee705908f36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.749459 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-metadata-0"] Dec 01 10:51:30 crc kubenswrapper[4909]: W1201 10:51:30.750440 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda6bc7be_6a1b_42f5_ae7c_1c7a5288755e.slice/crio-4e180236d6898b6c0564fc045ab7776105d81d71447f164fe44c2ed4f58b03a0 WatchSource:0}: Error finding container 4e180236d6898b6c0564fc045ab7776105d81d71447f164fe44c2ed4f58b03a0: Status 404 returned error can't find the container with id 4e180236d6898b6c0564fc045ab7776105d81d71447f164fe44c2ed4f58b03a0 Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.856265 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.868124 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.883219 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 10:51:30 crc kubenswrapper[4909]: E1201 10:51:30.883724 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e80240d-74a5-47b9-99f8-aee705908f36" containerName="nova-api-log" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.883742 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e80240d-74a5-47b9-99f8-aee705908f36" containerName="nova-api-log" Dec 01 10:51:30 crc kubenswrapper[4909]: E1201 10:51:30.883768 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e80240d-74a5-47b9-99f8-aee705908f36" containerName="nova-api-api" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.883775 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e80240d-74a5-47b9-99f8-aee705908f36" containerName="nova-api-api" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.884016 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e80240d-74a5-47b9-99f8-aee705908f36" containerName="nova-api-log" Dec 01 
10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.884041 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e80240d-74a5-47b9-99f8-aee705908f36" containerName="nova-api-api" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.885039 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.887115 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.887230 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.887258 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 10:51:30 crc kubenswrapper[4909]: I1201 10:51:30.896141 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.015829 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d7baabf-a92d-4e97-847f-aa1a692d206f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5d7baabf-a92d-4e97-847f-aa1a692d206f\") " pod="openstack/nova-api-0" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.017173 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d7baabf-a92d-4e97-847f-aa1a692d206f-config-data\") pod \"nova-api-0\" (UID: \"5d7baabf-a92d-4e97-847f-aa1a692d206f\") " pod="openstack/nova-api-0" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.017327 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d7baabf-a92d-4e97-847f-aa1a692d206f-logs\") 
pod \"nova-api-0\" (UID: \"5d7baabf-a92d-4e97-847f-aa1a692d206f\") " pod="openstack/nova-api-0" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.017364 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhw6f\" (UniqueName: \"kubernetes.io/projected/5d7baabf-a92d-4e97-847f-aa1a692d206f-kube-api-access-jhw6f\") pod \"nova-api-0\" (UID: \"5d7baabf-a92d-4e97-847f-aa1a692d206f\") " pod="openstack/nova-api-0" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.017476 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d7baabf-a92d-4e97-847f-aa1a692d206f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5d7baabf-a92d-4e97-847f-aa1a692d206f\") " pod="openstack/nova-api-0" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.017583 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d7baabf-a92d-4e97-847f-aa1a692d206f-public-tls-certs\") pod \"nova-api-0\" (UID: \"5d7baabf-a92d-4e97-847f-aa1a692d206f\") " pod="openstack/nova-api-0" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.120352 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d7baabf-a92d-4e97-847f-aa1a692d206f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5d7baabf-a92d-4e97-847f-aa1a692d206f\") " pod="openstack/nova-api-0" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.120414 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d7baabf-a92d-4e97-847f-aa1a692d206f-config-data\") pod \"nova-api-0\" (UID: \"5d7baabf-a92d-4e97-847f-aa1a692d206f\") " pod="openstack/nova-api-0" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.120458 
4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d7baabf-a92d-4e97-847f-aa1a692d206f-logs\") pod \"nova-api-0\" (UID: \"5d7baabf-a92d-4e97-847f-aa1a692d206f\") " pod="openstack/nova-api-0" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.120480 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhw6f\" (UniqueName: \"kubernetes.io/projected/5d7baabf-a92d-4e97-847f-aa1a692d206f-kube-api-access-jhw6f\") pod \"nova-api-0\" (UID: \"5d7baabf-a92d-4e97-847f-aa1a692d206f\") " pod="openstack/nova-api-0" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.121341 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d7baabf-a92d-4e97-847f-aa1a692d206f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5d7baabf-a92d-4e97-847f-aa1a692d206f\") " pod="openstack/nova-api-0" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.121417 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d7baabf-a92d-4e97-847f-aa1a692d206f-public-tls-certs\") pod \"nova-api-0\" (UID: \"5d7baabf-a92d-4e97-847f-aa1a692d206f\") " pod="openstack/nova-api-0" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.121658 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d7baabf-a92d-4e97-847f-aa1a692d206f-logs\") pod \"nova-api-0\" (UID: \"5d7baabf-a92d-4e97-847f-aa1a692d206f\") " pod="openstack/nova-api-0" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.126080 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d7baabf-a92d-4e97-847f-aa1a692d206f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5d7baabf-a92d-4e97-847f-aa1a692d206f\") " 
pod="openstack/nova-api-0" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.127112 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d7baabf-a92d-4e97-847f-aa1a692d206f-config-data\") pod \"nova-api-0\" (UID: \"5d7baabf-a92d-4e97-847f-aa1a692d206f\") " pod="openstack/nova-api-0" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.127816 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d7baabf-a92d-4e97-847f-aa1a692d206f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5d7baabf-a92d-4e97-847f-aa1a692d206f\") " pod="openstack/nova-api-0" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.131033 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d7baabf-a92d-4e97-847f-aa1a692d206f-public-tls-certs\") pod \"nova-api-0\" (UID: \"5d7baabf-a92d-4e97-847f-aa1a692d206f\") " pod="openstack/nova-api-0" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.149628 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhw6f\" (UniqueName: \"kubernetes.io/projected/5d7baabf-a92d-4e97-847f-aa1a692d206f-kube-api-access-jhw6f\") pod \"nova-api-0\" (UID: \"5d7baabf-a92d-4e97-847f-aa1a692d206f\") " pod="openstack/nova-api-0" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.268256 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16639ce4-cbc0-437d-b43c-63f9800bb171" path="/var/lib/kubelet/pods/16639ce4-cbc0-437d-b43c-63f9800bb171/volumes" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.269120 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e80240d-74a5-47b9-99f8-aee705908f36" path="/var/lib/kubelet/pods/7e80240d-74a5-47b9-99f8-aee705908f36/volumes" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.416087 4909 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.539064 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"da6bc7be-6a1b-42f5-ae7c-1c7a5288755e","Type":"ContainerStarted","Data":"54c84aecd4d07161193d9b532e613c21b87479edc7b1e4df2822e51074c58e3e"} Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.539601 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"da6bc7be-6a1b-42f5-ae7c-1c7a5288755e","Type":"ContainerStarted","Data":"69dd32508c209b923e216e0fa994e02e96a98b96edec85b6aed2fd0242edc263"} Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.539615 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"da6bc7be-6a1b-42f5-ae7c-1c7a5288755e","Type":"ContainerStarted","Data":"4e180236d6898b6c0564fc045ab7776105d81d71447f164fe44c2ed4f58b03a0"} Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.557526 4909 generic.go:334] "Generic (PLEG): container finished" podID="b421bbb3-25fb-4eb5-ac53-a03f2f941b6a" containerID="f6e3d23adda41159f95ab80623ddd7858758ac4019ff386005a408f5414d7960" exitCode=0 Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.557579 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b421bbb3-25fb-4eb5-ac53-a03f2f941b6a","Type":"ContainerDied","Data":"f6e3d23adda41159f95ab80623ddd7858758ac4019ff386005a408f5414d7960"} Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.580637 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.580616062 podStartE2EDuration="2.580616062s" podCreationTimestamp="2025-12-01 10:51:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:51:31.567892472 +0000 
UTC m=+1208.802363390" watchObservedRunningTime="2025-12-01 10:51:31.580616062 +0000 UTC m=+1208.815086950" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.621970 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.733355 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b421bbb3-25fb-4eb5-ac53-a03f2f941b6a-config-data\") pod \"b421bbb3-25fb-4eb5-ac53-a03f2f941b6a\" (UID: \"b421bbb3-25fb-4eb5-ac53-a03f2f941b6a\") " Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.733533 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhkfg\" (UniqueName: \"kubernetes.io/projected/b421bbb3-25fb-4eb5-ac53-a03f2f941b6a-kube-api-access-hhkfg\") pod \"b421bbb3-25fb-4eb5-ac53-a03f2f941b6a\" (UID: \"b421bbb3-25fb-4eb5-ac53-a03f2f941b6a\") " Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.733683 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b421bbb3-25fb-4eb5-ac53-a03f2f941b6a-combined-ca-bundle\") pod \"b421bbb3-25fb-4eb5-ac53-a03f2f941b6a\" (UID: \"b421bbb3-25fb-4eb5-ac53-a03f2f941b6a\") " Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.741042 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b421bbb3-25fb-4eb5-ac53-a03f2f941b6a-kube-api-access-hhkfg" (OuterVolumeSpecName: "kube-api-access-hhkfg") pod "b421bbb3-25fb-4eb5-ac53-a03f2f941b6a" (UID: "b421bbb3-25fb-4eb5-ac53-a03f2f941b6a"). InnerVolumeSpecName "kube-api-access-hhkfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.766522 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b421bbb3-25fb-4eb5-ac53-a03f2f941b6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b421bbb3-25fb-4eb5-ac53-a03f2f941b6a" (UID: "b421bbb3-25fb-4eb5-ac53-a03f2f941b6a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.768422 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b421bbb3-25fb-4eb5-ac53-a03f2f941b6a-config-data" (OuterVolumeSpecName: "config-data") pod "b421bbb3-25fb-4eb5-ac53-a03f2f941b6a" (UID: "b421bbb3-25fb-4eb5-ac53-a03f2f941b6a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.835651 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b421bbb3-25fb-4eb5-ac53-a03f2f941b6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.835952 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b421bbb3-25fb-4eb5-ac53-a03f2f941b6a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.835964 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhkfg\" (UniqueName: \"kubernetes.io/projected/b421bbb3-25fb-4eb5-ac53-a03f2f941b6a-kube-api-access-hhkfg\") on node \"crc\" DevicePath \"\"" Dec 01 10:51:31 crc kubenswrapper[4909]: I1201 10:51:31.932857 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:51:31 crc kubenswrapper[4909]: W1201 10:51:31.940152 4909 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d7baabf_a92d_4e97_847f_aa1a692d206f.slice/crio-e1c0ebb7b5e9a6b1c7015f7f4eb03ff55a215d34c88c317da42069a8e98fbbb1 WatchSource:0}: Error finding container e1c0ebb7b5e9a6b1c7015f7f4eb03ff55a215d34c88c317da42069a8e98fbbb1: Status 404 returned error can't find the container with id e1c0ebb7b5e9a6b1c7015f7f4eb03ff55a215d34c88c317da42069a8e98fbbb1 Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.571263 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b421bbb3-25fb-4eb5-ac53-a03f2f941b6a","Type":"ContainerDied","Data":"1f4a6c028440273226731ba4f9413be19b4f31c96432bb53889e9a2174acf215"} Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.571547 4909 scope.go:117] "RemoveContainer" containerID="f6e3d23adda41159f95ab80623ddd7858758ac4019ff386005a408f5414d7960" Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.571271 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.583465 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d7baabf-a92d-4e97-847f-aa1a692d206f","Type":"ContainerStarted","Data":"8080f91bccbed4908383dedf68cb98b8a58f2319cc3674c6dc25d81b7e58a786"} Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.583509 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d7baabf-a92d-4e97-847f-aa1a692d206f","Type":"ContainerStarted","Data":"f418735d3439b2a20c1eb1ceecea25d53351f2c959c8f7aef2e8edf15a27e14c"} Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.583521 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5d7baabf-a92d-4e97-847f-aa1a692d206f","Type":"ContainerStarted","Data":"e1c0ebb7b5e9a6b1c7015f7f4eb03ff55a215d34c88c317da42069a8e98fbbb1"} Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.623141 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.623113667 podStartE2EDuration="2.623113667s" podCreationTimestamp="2025-12-01 10:51:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:51:32.60454975 +0000 UTC m=+1209.839020658" watchObservedRunningTime="2025-12-01 10:51:32.623113667 +0000 UTC m=+1209.857584565" Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.677785 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.702616 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.712029 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:51:32 crc kubenswrapper[4909]: E1201 
10:51:32.712506 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b421bbb3-25fb-4eb5-ac53-a03f2f941b6a" containerName="nova-scheduler-scheduler" Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.712528 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b421bbb3-25fb-4eb5-ac53-a03f2f941b6a" containerName="nova-scheduler-scheduler" Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.712751 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="b421bbb3-25fb-4eb5-ac53-a03f2f941b6a" containerName="nova-scheduler-scheduler" Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.713582 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.716471 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.727200 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:51:32 crc kubenswrapper[4909]: E1201 10:51:32.839936 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb421bbb3_25fb_4eb5_ac53_a03f2f941b6a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb421bbb3_25fb_4eb5_ac53_a03f2f941b6a.slice/crio-1f4a6c028440273226731ba4f9413be19b4f31c96432bb53889e9a2174acf215\": RecentStats: unable to find data in memory cache]" Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.857305 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4pf6\" (UniqueName: \"kubernetes.io/projected/536ef4f2-2531-42e1-8c55-44e21e282a03-kube-api-access-p4pf6\") pod \"nova-scheduler-0\" (UID: 
\"536ef4f2-2531-42e1-8c55-44e21e282a03\") " pod="openstack/nova-scheduler-0" Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.857401 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/536ef4f2-2531-42e1-8c55-44e21e282a03-config-data\") pod \"nova-scheduler-0\" (UID: \"536ef4f2-2531-42e1-8c55-44e21e282a03\") " pod="openstack/nova-scheduler-0" Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.857616 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/536ef4f2-2531-42e1-8c55-44e21e282a03-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"536ef4f2-2531-42e1-8c55-44e21e282a03\") " pod="openstack/nova-scheduler-0" Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.959406 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/536ef4f2-2531-42e1-8c55-44e21e282a03-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"536ef4f2-2531-42e1-8c55-44e21e282a03\") " pod="openstack/nova-scheduler-0" Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.959538 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4pf6\" (UniqueName: \"kubernetes.io/projected/536ef4f2-2531-42e1-8c55-44e21e282a03-kube-api-access-p4pf6\") pod \"nova-scheduler-0\" (UID: \"536ef4f2-2531-42e1-8c55-44e21e282a03\") " pod="openstack/nova-scheduler-0" Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.959598 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/536ef4f2-2531-42e1-8c55-44e21e282a03-config-data\") pod \"nova-scheduler-0\" (UID: \"536ef4f2-2531-42e1-8c55-44e21e282a03\") " pod="openstack/nova-scheduler-0" Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 
10:51:32.965410 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/536ef4f2-2531-42e1-8c55-44e21e282a03-config-data\") pod \"nova-scheduler-0\" (UID: \"536ef4f2-2531-42e1-8c55-44e21e282a03\") " pod="openstack/nova-scheduler-0" Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.967279 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/536ef4f2-2531-42e1-8c55-44e21e282a03-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"536ef4f2-2531-42e1-8c55-44e21e282a03\") " pod="openstack/nova-scheduler-0" Dec 01 10:51:32 crc kubenswrapper[4909]: I1201 10:51:32.977853 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4pf6\" (UniqueName: \"kubernetes.io/projected/536ef4f2-2531-42e1-8c55-44e21e282a03-kube-api-access-p4pf6\") pod \"nova-scheduler-0\" (UID: \"536ef4f2-2531-42e1-8c55-44e21e282a03\") " pod="openstack/nova-scheduler-0" Dec 01 10:51:33 crc kubenswrapper[4909]: I1201 10:51:33.037862 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:51:33 crc kubenswrapper[4909]: I1201 10:51:33.270532 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b421bbb3-25fb-4eb5-ac53-a03f2f941b6a" path="/var/lib/kubelet/pods/b421bbb3-25fb-4eb5-ac53-a03f2f941b6a/volumes" Dec 01 10:51:33 crc kubenswrapper[4909]: I1201 10:51:33.332337 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:51:33 crc kubenswrapper[4909]: W1201 10:51:33.332684 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod536ef4f2_2531_42e1_8c55_44e21e282a03.slice/crio-97de7e00a7c3b797554dff381dfc16bc02058b6bab1bf2612db3958d4e807ec7 WatchSource:0}: Error finding container 97de7e00a7c3b797554dff381dfc16bc02058b6bab1bf2612db3958d4e807ec7: Status 404 returned error can't find the container with id 97de7e00a7c3b797554dff381dfc16bc02058b6bab1bf2612db3958d4e807ec7 Dec 01 10:51:33 crc kubenswrapper[4909]: I1201 10:51:33.591536 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"536ef4f2-2531-42e1-8c55-44e21e282a03","Type":"ContainerStarted","Data":"97de7e00a7c3b797554dff381dfc16bc02058b6bab1bf2612db3958d4e807ec7"} Dec 01 10:51:34 crc kubenswrapper[4909]: I1201 10:51:34.604702 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"536ef4f2-2531-42e1-8c55-44e21e282a03","Type":"ContainerStarted","Data":"c9aecfbd24f4e63c511e03d2fa4297306cfdfa7fe92ae60986c74aaeabb04cef"} Dec 01 10:51:35 crc kubenswrapper[4909]: I1201 10:51:35.255470 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 10:51:35 crc kubenswrapper[4909]: I1201 10:51:35.273045 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 10:51:38 crc kubenswrapper[4909]: I1201 10:51:38.038048 
4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 10:51:40 crc kubenswrapper[4909]: I1201 10:51:40.255788 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 10:51:40 crc kubenswrapper[4909]: I1201 10:51:40.256473 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 10:51:41 crc kubenswrapper[4909]: I1201 10:51:41.274201 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="da6bc7be-6a1b-42f5-ae7c-1c7a5288755e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 10:51:41 crc kubenswrapper[4909]: I1201 10:51:41.274215 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="da6bc7be-6a1b-42f5-ae7c-1c7a5288755e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 10:51:41 crc kubenswrapper[4909]: I1201 10:51:41.416811 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 10:51:41 crc kubenswrapper[4909]: I1201 10:51:41.416854 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 10:51:41 crc kubenswrapper[4909]: I1201 10:51:41.953831 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 10:51:41 crc kubenswrapper[4909]: I1201 10:51:41.983644 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=9.983607935 podStartE2EDuration="9.983607935s" podCreationTimestamp="2025-12-01 10:51:32 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:51:34.632284651 +0000 UTC m=+1211.866755549" watchObservedRunningTime="2025-12-01 10:51:41.983607935 +0000 UTC m=+1219.218078853" Dec 01 10:51:42 crc kubenswrapper[4909]: I1201 10:51:42.431216 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5d7baabf-a92d-4e97-847f-aa1a692d206f" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.185:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 10:51:42 crc kubenswrapper[4909]: I1201 10:51:42.431264 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5d7baabf-a92d-4e97-847f-aa1a692d206f" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.185:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 10:51:43 crc kubenswrapper[4909]: I1201 10:51:43.039112 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 10:51:43 crc kubenswrapper[4909]: I1201 10:51:43.068118 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 10:51:43 crc kubenswrapper[4909]: I1201 10:51:43.740809 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 10:51:50 crc kubenswrapper[4909]: I1201 10:51:50.261683 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 10:51:50 crc kubenswrapper[4909]: I1201 10:51:50.262435 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 10:51:50 crc kubenswrapper[4909]: I1201 10:51:50.267698 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Dec 01 10:51:50 crc kubenswrapper[4909]: I1201 10:51:50.273972 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 10:51:51 crc kubenswrapper[4909]: I1201 10:51:51.424751 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 10:51:51 crc kubenswrapper[4909]: I1201 10:51:51.425589 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 10:51:51 crc kubenswrapper[4909]: I1201 10:51:51.425773 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 10:51:51 crc kubenswrapper[4909]: I1201 10:51:51.440387 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 10:51:51 crc kubenswrapper[4909]: I1201 10:51:51.779433 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 10:51:51 crc kubenswrapper[4909]: I1201 10:51:51.786512 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 10:51:59 crc kubenswrapper[4909]: I1201 10:51:59.958760 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 10:52:01 crc kubenswrapper[4909]: I1201 10:52:01.115393 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 10:52:04 crc kubenswrapper[4909]: I1201 10:52:04.976378 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="fa1d0c2b-1efc-451b-9fe5-58debd89810e" containerName="rabbitmq" containerID="cri-o://5d6d7cda93afc9f486e8d753e671c51f23c70cf9e4c5e73c9c6b304d85d4bad2" gracePeriod=604795 Dec 01 10:52:05 crc kubenswrapper[4909]: I1201 10:52:05.827124 4909 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/rabbitmq-cell1-server-0" podUID="226ba07f-6dee-4f12-9d0e-4ae327457c2e" containerName="rabbitmq" containerID="cri-o://97666d565341b4b350a300b0bc8e32c12b05348bf42108d20b498e93f8aea214" gracePeriod=604796 Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.546170 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.721487 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fa1d0c2b-1efc-451b-9fe5-58debd89810e-erlang-cookie-secret\") pod \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.721850 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdwwb\" (UniqueName: \"kubernetes.io/projected/fa1d0c2b-1efc-451b-9fe5-58debd89810e-kube-api-access-kdwwb\") pod \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.721934 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-plugins\") pod \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.722000 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fa1d0c2b-1efc-451b-9fe5-58debd89810e-server-conf\") pod \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.722029 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-erlang-cookie\") pod \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.722062 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa1d0c2b-1efc-451b-9fe5-58debd89810e-config-data\") pod \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.722099 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-tls\") pod \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.722130 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-confd\") pod \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.722231 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.722292 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fa1d0c2b-1efc-451b-9fe5-58debd89810e-plugins-conf\") pod \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 
10:52:11.722382 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fa1d0c2b-1efc-451b-9fe5-58debd89810e-pod-info\") pod \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\" (UID: \"fa1d0c2b-1efc-451b-9fe5-58debd89810e\") " Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.726363 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa1d0c2b-1efc-451b-9fe5-58debd89810e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "fa1d0c2b-1efc-451b-9fe5-58debd89810e" (UID: "fa1d0c2b-1efc-451b-9fe5-58debd89810e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.727763 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "fa1d0c2b-1efc-451b-9fe5-58debd89810e" (UID: "fa1d0c2b-1efc-451b-9fe5-58debd89810e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.728790 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "fa1d0c2b-1efc-451b-9fe5-58debd89810e" (UID: "fa1d0c2b-1efc-451b-9fe5-58debd89810e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.729978 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "fa1d0c2b-1efc-451b-9fe5-58debd89810e" (UID: "fa1d0c2b-1efc-451b-9fe5-58debd89810e"). 
InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.734226 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa1d0c2b-1efc-451b-9fe5-58debd89810e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "fa1d0c2b-1efc-451b-9fe5-58debd89810e" (UID: "fa1d0c2b-1efc-451b-9fe5-58debd89810e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.735139 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa1d0c2b-1efc-451b-9fe5-58debd89810e-kube-api-access-kdwwb" (OuterVolumeSpecName: "kube-api-access-kdwwb") pod "fa1d0c2b-1efc-451b-9fe5-58debd89810e" (UID: "fa1d0c2b-1efc-451b-9fe5-58debd89810e"). InnerVolumeSpecName "kube-api-access-kdwwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.735549 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/fa1d0c2b-1efc-451b-9fe5-58debd89810e-pod-info" (OuterVolumeSpecName: "pod-info") pod "fa1d0c2b-1efc-451b-9fe5-58debd89810e" (UID: "fa1d0c2b-1efc-451b-9fe5-58debd89810e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.736669 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "fa1d0c2b-1efc-451b-9fe5-58debd89810e" (UID: "fa1d0c2b-1efc-451b-9fe5-58debd89810e"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.769370 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa1d0c2b-1efc-451b-9fe5-58debd89810e-config-data" (OuterVolumeSpecName: "config-data") pod "fa1d0c2b-1efc-451b-9fe5-58debd89810e" (UID: "fa1d0c2b-1efc-451b-9fe5-58debd89810e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.790621 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa1d0c2b-1efc-451b-9fe5-58debd89810e-server-conf" (OuterVolumeSpecName: "server-conf") pod "fa1d0c2b-1efc-451b-9fe5-58debd89810e" (UID: "fa1d0c2b-1efc-451b-9fe5-58debd89810e"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.824972 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.825008 4909 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fa1d0c2b-1efc-451b-9fe5-58debd89810e-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.825023 4909 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fa1d0c2b-1efc-451b-9fe5-58debd89810e-pod-info\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.825034 4909 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fa1d0c2b-1efc-451b-9fe5-58debd89810e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:11 crc 
kubenswrapper[4909]: I1201 10:52:11.825047 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdwwb\" (UniqueName: \"kubernetes.io/projected/fa1d0c2b-1efc-451b-9fe5-58debd89810e-kube-api-access-kdwwb\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.825058 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.825069 4909 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fa1d0c2b-1efc-451b-9fe5-58debd89810e-server-conf\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.825080 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.825090 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa1d0c2b-1efc-451b-9fe5-58debd89810e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.825100 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.845227 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "fa1d0c2b-1efc-451b-9fe5-58debd89810e" (UID: "fa1d0c2b-1efc-451b-9fe5-58debd89810e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.853308 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.927366 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fa1d0c2b-1efc-451b-9fe5-58debd89810e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.927400 4909 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.955895 4909 generic.go:334] "Generic (PLEG): container finished" podID="fa1d0c2b-1efc-451b-9fe5-58debd89810e" containerID="5d6d7cda93afc9f486e8d753e671c51f23c70cf9e4c5e73c9c6b304d85d4bad2" exitCode=0 Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.955966 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fa1d0c2b-1efc-451b-9fe5-58debd89810e","Type":"ContainerDied","Data":"5d6d7cda93afc9f486e8d753e671c51f23c70cf9e4c5e73c9c6b304d85d4bad2"} Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.955976 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.956000 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fa1d0c2b-1efc-451b-9fe5-58debd89810e","Type":"ContainerDied","Data":"70fdfaffb7938545bb4616a640ded8cd8644487fb24cb64f6b1eae8c7fe0e856"} Dec 01 10:52:11 crc kubenswrapper[4909]: I1201 10:52:11.956021 4909 scope.go:117] "RemoveContainer" containerID="5d6d7cda93afc9f486e8d753e671c51f23c70cf9e4c5e73c9c6b304d85d4bad2" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.031189 4909 scope.go:117] "RemoveContainer" containerID="ff97b98aa46f1e357eecf9003d1f326ad5676fd78ea78233559f9b378f7c8f59" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.069696 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.086703 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.105516 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 10:52:12 crc kubenswrapper[4909]: E1201 10:52:12.105991 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1d0c2b-1efc-451b-9fe5-58debd89810e" containerName="setup-container" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.106012 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1d0c2b-1efc-451b-9fe5-58debd89810e" containerName="setup-container" Dec 01 10:52:12 crc kubenswrapper[4909]: E1201 10:52:12.106026 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1d0c2b-1efc-451b-9fe5-58debd89810e" containerName="rabbitmq" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.106033 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1d0c2b-1efc-451b-9fe5-58debd89810e" containerName="rabbitmq" Dec 01 10:52:12 crc kubenswrapper[4909]: 
I1201 10:52:12.106229 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa1d0c2b-1efc-451b-9fe5-58debd89810e" containerName="rabbitmq" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.107295 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.110743 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.111152 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.111336 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.111643 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.112098 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.112281 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.113076 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-cvcbl" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.118055 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.165383 4909 scope.go:117] "RemoveContainer" containerID="5d6d7cda93afc9f486e8d753e671c51f23c70cf9e4c5e73c9c6b304d85d4bad2" Dec 01 10:52:12 crc kubenswrapper[4909]: E1201 10:52:12.166221 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"5d6d7cda93afc9f486e8d753e671c51f23c70cf9e4c5e73c9c6b304d85d4bad2\": container with ID starting with 5d6d7cda93afc9f486e8d753e671c51f23c70cf9e4c5e73c9c6b304d85d4bad2 not found: ID does not exist" containerID="5d6d7cda93afc9f486e8d753e671c51f23c70cf9e4c5e73c9c6b304d85d4bad2" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.166252 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6d7cda93afc9f486e8d753e671c51f23c70cf9e4c5e73c9c6b304d85d4bad2"} err="failed to get container status \"5d6d7cda93afc9f486e8d753e671c51f23c70cf9e4c5e73c9c6b304d85d4bad2\": rpc error: code = NotFound desc = could not find container \"5d6d7cda93afc9f486e8d753e671c51f23c70cf9e4c5e73c9c6b304d85d4bad2\": container with ID starting with 5d6d7cda93afc9f486e8d753e671c51f23c70cf9e4c5e73c9c6b304d85d4bad2 not found: ID does not exist" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.166280 4909 scope.go:117] "RemoveContainer" containerID="ff97b98aa46f1e357eecf9003d1f326ad5676fd78ea78233559f9b378f7c8f59" Dec 01 10:52:12 crc kubenswrapper[4909]: E1201 10:52:12.166543 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff97b98aa46f1e357eecf9003d1f326ad5676fd78ea78233559f9b378f7c8f59\": container with ID starting with ff97b98aa46f1e357eecf9003d1f326ad5676fd78ea78233559f9b378f7c8f59 not found: ID does not exist" containerID="ff97b98aa46f1e357eecf9003d1f326ad5676fd78ea78233559f9b378f7c8f59" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.166570 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff97b98aa46f1e357eecf9003d1f326ad5676fd78ea78233559f9b378f7c8f59"} err="failed to get container status \"ff97b98aa46f1e357eecf9003d1f326ad5676fd78ea78233559f9b378f7c8f59\": rpc error: code = NotFound desc = could not find container 
\"ff97b98aa46f1e357eecf9003d1f326ad5676fd78ea78233559f9b378f7c8f59\": container with ID starting with ff97b98aa46f1e357eecf9003d1f326ad5676fd78ea78233559f9b378f7c8f59 not found: ID does not exist" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.234126 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.234238 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.234268 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.234287 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk4l8\" (UniqueName: \"kubernetes.io/projected/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-kube-api-access-jk4l8\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.234310 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.234335 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.234370 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.234422 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.234455 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.234512 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-config-data\") pod 
\"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.234539 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.337676 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.337796 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-config-data\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.337839 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.337935 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc 
kubenswrapper[4909]: I1201 10:52:12.338014 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.338984 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.338990 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.338532 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.339019 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk4l8\" (UniqueName: \"kubernetes.io/projected/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-kube-api-access-jk4l8\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.339131 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.339144 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-config-data\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.339173 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.339237 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.339394 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.339464 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 
10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.339509 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.339237 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.343762 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.344367 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.344732 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.356845 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.361887 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk4l8\" (UniqueName: \"kubernetes.io/projected/ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b-kube-api-access-jk4l8\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.383789 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b\") " pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.447415 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.519747 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.645297 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/226ba07f-6dee-4f12-9d0e-4ae327457c2e-config-data\") pod \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.645716 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-tls\") pod \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.645766 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/226ba07f-6dee-4f12-9d0e-4ae327457c2e-erlang-cookie-secret\") pod \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.645817 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbl7d\" (UniqueName: \"kubernetes.io/projected/226ba07f-6dee-4f12-9d0e-4ae327457c2e-kube-api-access-zbl7d\") pod \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.645902 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/226ba07f-6dee-4f12-9d0e-4ae327457c2e-pod-info\") pod \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.645965 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-confd\") pod \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.646026 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-plugins\") pod \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.646132 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/226ba07f-6dee-4f12-9d0e-4ae327457c2e-server-conf\") pod \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.646160 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.646194 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-erlang-cookie\") pod \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.646229 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/226ba07f-6dee-4f12-9d0e-4ae327457c2e-plugins-conf\") pod \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\" (UID: \"226ba07f-6dee-4f12-9d0e-4ae327457c2e\") " Dec 01 10:52:12 crc kubenswrapper[4909]: 
I1201 10:52:12.647453 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/226ba07f-6dee-4f12-9d0e-4ae327457c2e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "226ba07f-6dee-4f12-9d0e-4ae327457c2e" (UID: "226ba07f-6dee-4f12-9d0e-4ae327457c2e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.648320 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "226ba07f-6dee-4f12-9d0e-4ae327457c2e" (UID: "226ba07f-6dee-4f12-9d0e-4ae327457c2e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.649104 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "226ba07f-6dee-4f12-9d0e-4ae327457c2e" (UID: "226ba07f-6dee-4f12-9d0e-4ae327457c2e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.658659 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/226ba07f-6dee-4f12-9d0e-4ae327457c2e-kube-api-access-zbl7d" (OuterVolumeSpecName: "kube-api-access-zbl7d") pod "226ba07f-6dee-4f12-9d0e-4ae327457c2e" (UID: "226ba07f-6dee-4f12-9d0e-4ae327457c2e"). InnerVolumeSpecName "kube-api-access-zbl7d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.658835 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "226ba07f-6dee-4f12-9d0e-4ae327457c2e" (UID: "226ba07f-6dee-4f12-9d0e-4ae327457c2e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.659328 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/226ba07f-6dee-4f12-9d0e-4ae327457c2e-pod-info" (OuterVolumeSpecName: "pod-info") pod "226ba07f-6dee-4f12-9d0e-4ae327457c2e" (UID: "226ba07f-6dee-4f12-9d0e-4ae327457c2e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.659685 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "226ba07f-6dee-4f12-9d0e-4ae327457c2e" (UID: "226ba07f-6dee-4f12-9d0e-4ae327457c2e"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.659747 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/226ba07f-6dee-4f12-9d0e-4ae327457c2e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "226ba07f-6dee-4f12-9d0e-4ae327457c2e" (UID: "226ba07f-6dee-4f12-9d0e-4ae327457c2e"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.687572 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/226ba07f-6dee-4f12-9d0e-4ae327457c2e-config-data" (OuterVolumeSpecName: "config-data") pod "226ba07f-6dee-4f12-9d0e-4ae327457c2e" (UID: "226ba07f-6dee-4f12-9d0e-4ae327457c2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.709988 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/226ba07f-6dee-4f12-9d0e-4ae327457c2e-server-conf" (OuterVolumeSpecName: "server-conf") pod "226ba07f-6dee-4f12-9d0e-4ae327457c2e" (UID: "226ba07f-6dee-4f12-9d0e-4ae327457c2e"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.749286 4909 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/226ba07f-6dee-4f12-9d0e-4ae327457c2e-pod-info\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.749322 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.749334 4909 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/226ba07f-6dee-4f12-9d0e-4ae327457c2e-server-conf\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.749373 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 01 10:52:12 crc 
kubenswrapper[4909]: I1201 10:52:12.749387 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.749401 4909 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/226ba07f-6dee-4f12-9d0e-4ae327457c2e-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.749411 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/226ba07f-6dee-4f12-9d0e-4ae327457c2e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.749422 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.749433 4909 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/226ba07f-6dee-4f12-9d0e-4ae327457c2e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.749443 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbl7d\" (UniqueName: \"kubernetes.io/projected/226ba07f-6dee-4f12-9d0e-4ae327457c2e-kube-api-access-zbl7d\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.760788 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "226ba07f-6dee-4f12-9d0e-4ae327457c2e" (UID: "226ba07f-6dee-4f12-9d0e-4ae327457c2e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.771686 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.851777 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/226ba07f-6dee-4f12-9d0e-4ae327457c2e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.851815 4909 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.962346 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.969579 4909 generic.go:334] "Generic (PLEG): container finished" podID="226ba07f-6dee-4f12-9d0e-4ae327457c2e" containerID="97666d565341b4b350a300b0bc8e32c12b05348bf42108d20b498e93f8aea214" exitCode=0 Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.969647 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"226ba07f-6dee-4f12-9d0e-4ae327457c2e","Type":"ContainerDied","Data":"97666d565341b4b350a300b0bc8e32c12b05348bf42108d20b498e93f8aea214"} Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.969682 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"226ba07f-6dee-4f12-9d0e-4ae327457c2e","Type":"ContainerDied","Data":"6f67d1f2c7376924ecb88bdb43bcc5af5dd5f5e04bd11ac9f731f0ed2ef413c4"} Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.969699 4909 scope.go:117] "RemoveContainer" containerID="97666d565341b4b350a300b0bc8e32c12b05348bf42108d20b498e93f8aea214" 
Dec 01 10:52:12 crc kubenswrapper[4909]: I1201 10:52:12.969651 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.011207 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.019248 4909 scope.go:117] "RemoveContainer" containerID="decbb764066fab68413640f5ed91146e03e1ed0ca0234962058e3d16b081d56a" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.027388 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.043316 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 10:52:13 crc kubenswrapper[4909]: E1201 10:52:13.043827 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="226ba07f-6dee-4f12-9d0e-4ae327457c2e" containerName="setup-container" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.043845 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="226ba07f-6dee-4f12-9d0e-4ae327457c2e" containerName="setup-container" Dec 01 10:52:13 crc kubenswrapper[4909]: E1201 10:52:13.043899 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="226ba07f-6dee-4f12-9d0e-4ae327457c2e" containerName="rabbitmq" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.043909 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="226ba07f-6dee-4f12-9d0e-4ae327457c2e" containerName="rabbitmq" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.044155 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="226ba07f-6dee-4f12-9d0e-4ae327457c2e" containerName="rabbitmq" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.045418 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.054525 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.054867 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.055124 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.055371 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.055593 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.055840 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.056707 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-pcgf7" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.064659 4909 scope.go:117] "RemoveContainer" containerID="97666d565341b4b350a300b0bc8e32c12b05348bf42108d20b498e93f8aea214" Dec 01 10:52:13 crc kubenswrapper[4909]: E1201 10:52:13.069121 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97666d565341b4b350a300b0bc8e32c12b05348bf42108d20b498e93f8aea214\": container with ID starting with 97666d565341b4b350a300b0bc8e32c12b05348bf42108d20b498e93f8aea214 not found: ID does not exist" containerID="97666d565341b4b350a300b0bc8e32c12b05348bf42108d20b498e93f8aea214" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.069176 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97666d565341b4b350a300b0bc8e32c12b05348bf42108d20b498e93f8aea214"} err="failed to get container status \"97666d565341b4b350a300b0bc8e32c12b05348bf42108d20b498e93f8aea214\": rpc error: code = NotFound desc = could not find container \"97666d565341b4b350a300b0bc8e32c12b05348bf42108d20b498e93f8aea214\": container with ID starting with 97666d565341b4b350a300b0bc8e32c12b05348bf42108d20b498e93f8aea214 not found: ID does not exist" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.069212 4909 scope.go:117] "RemoveContainer" containerID="decbb764066fab68413640f5ed91146e03e1ed0ca0234962058e3d16b081d56a" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.072608 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 10:52:13 crc kubenswrapper[4909]: E1201 10:52:13.073417 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"decbb764066fab68413640f5ed91146e03e1ed0ca0234962058e3d16b081d56a\": container with ID starting with decbb764066fab68413640f5ed91146e03e1ed0ca0234962058e3d16b081d56a not found: ID does not exist" containerID="decbb764066fab68413640f5ed91146e03e1ed0ca0234962058e3d16b081d56a" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.073468 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"decbb764066fab68413640f5ed91146e03e1ed0ca0234962058e3d16b081d56a"} err="failed to get container status \"decbb764066fab68413640f5ed91146e03e1ed0ca0234962058e3d16b081d56a\": rpc error: code = NotFound desc = could not find container \"decbb764066fab68413640f5ed91146e03e1ed0ca0234962058e3d16b081d56a\": container with ID starting with decbb764066fab68413640f5ed91146e03e1ed0ca0234962058e3d16b081d56a not found: ID does not exist" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.164195 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/46315eac-b29e-48fa-864d-f105eefd2fc3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.164640 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46315eac-b29e-48fa-864d-f105eefd2fc3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.164672 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6zh6\" (UniqueName: \"kubernetes.io/projected/46315eac-b29e-48fa-864d-f105eefd2fc3-kube-api-access-q6zh6\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.164747 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/46315eac-b29e-48fa-864d-f105eefd2fc3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.164897 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.164987 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/46315eac-b29e-48fa-864d-f105eefd2fc3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.165069 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/46315eac-b29e-48fa-864d-f105eefd2fc3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.165116 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/46315eac-b29e-48fa-864d-f105eefd2fc3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.165136 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/46315eac-b29e-48fa-864d-f105eefd2fc3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.165231 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/46315eac-b29e-48fa-864d-f105eefd2fc3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.165302 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/46315eac-b29e-48fa-864d-f105eefd2fc3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.267590 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46315eac-b29e-48fa-864d-f105eefd2fc3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.267688 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6zh6\" (UniqueName: \"kubernetes.io/projected/46315eac-b29e-48fa-864d-f105eefd2fc3-kube-api-access-q6zh6\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.267741 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/46315eac-b29e-48fa-864d-f105eefd2fc3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.267804 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.267850 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/46315eac-b29e-48fa-864d-f105eefd2fc3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.267915 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/46315eac-b29e-48fa-864d-f105eefd2fc3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.267957 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/46315eac-b29e-48fa-864d-f105eefd2fc3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.267983 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/46315eac-b29e-48fa-864d-f105eefd2fc3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.268044 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/46315eac-b29e-48fa-864d-f105eefd2fc3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.268102 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/46315eac-b29e-48fa-864d-f105eefd2fc3-rabbitmq-confd\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.268164 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/46315eac-b29e-48fa-864d-f105eefd2fc3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.268849 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46315eac-b29e-48fa-864d-f105eefd2fc3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.269930 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/46315eac-b29e-48fa-864d-f105eefd2fc3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.270233 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/46315eac-b29e-48fa-864d-f105eefd2fc3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.271794 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/46315eac-b29e-48fa-864d-f105eefd2fc3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 
10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.272061 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="226ba07f-6dee-4f12-9d0e-4ae327457c2e" path="/var/lib/kubelet/pods/226ba07f-6dee-4f12-9d0e-4ae327457c2e/volumes" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.272751 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.273484 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/46315eac-b29e-48fa-864d-f105eefd2fc3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.275519 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/46315eac-b29e-48fa-864d-f105eefd2fc3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.275691 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/46315eac-b29e-48fa-864d-f105eefd2fc3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.275982 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/46315eac-b29e-48fa-864d-f105eefd2fc3-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.276407 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa1d0c2b-1efc-451b-9fe5-58debd89810e" path="/var/lib/kubelet/pods/fa1d0c2b-1efc-451b-9fe5-58debd89810e/volumes" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.280550 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/46315eac-b29e-48fa-864d-f105eefd2fc3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.295278 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6zh6\" (UniqueName: \"kubernetes.io/projected/46315eac-b29e-48fa-864d-f105eefd2fc3-kube-api-access-q6zh6\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.316999 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"46315eac-b29e-48fa-864d-f105eefd2fc3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.409378 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:13 crc kubenswrapper[4909]: I1201 10:52:13.777531 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 10:52:14 crc kubenswrapper[4909]: I1201 10:52:13.989167 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"46315eac-b29e-48fa-864d-f105eefd2fc3","Type":"ContainerStarted","Data":"a4f75cc08359d3218714b0008d1b388bcad8c86a28d9ae541dbb15dc0540b2d9"} Dec 01 10:52:14 crc kubenswrapper[4909]: I1201 10:52:13.997557 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b","Type":"ContainerStarted","Data":"a8b624e279384e4de5755279527cfdde17d448e85d5e9d5b7db68acd87e5778f"} Dec 01 10:52:15 crc kubenswrapper[4909]: I1201 10:52:15.007976 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b","Type":"ContainerStarted","Data":"d888684a9f294c58fc65acc3cf78ac33a1f75d2b89f618d5d4a1336a08c15238"} Dec 01 10:52:15 crc kubenswrapper[4909]: I1201 10:52:15.891820 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-l5g9t"] Dec 01 10:52:15 crc kubenswrapper[4909]: I1201 10:52:15.903977 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:15 crc kubenswrapper[4909]: I1201 10:52:15.908969 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 01 10:52:15 crc kubenswrapper[4909]: I1201 10:52:15.950069 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-l5g9t"] Dec 01 10:52:16 crc kubenswrapper[4909]: I1201 10:52:16.016586 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"46315eac-b29e-48fa-864d-f105eefd2fc3","Type":"ContainerStarted","Data":"14a87b2789a45a8edbc95211bd5c17b090ce5f005fb5bcc43041f9315ea476f7"} Dec 01 10:52:16 crc kubenswrapper[4909]: I1201 10:52:16.028526 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-l5g9t\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:16 crc kubenswrapper[4909]: I1201 10:52:16.028633 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-config\") pod \"dnsmasq-dns-6447ccbd8f-l5g9t\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:16 crc kubenswrapper[4909]: I1201 10:52:16.028680 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-l5g9t\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:16 crc kubenswrapper[4909]: I1201 10:52:16.028740 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndrtw\" (UniqueName: \"kubernetes.io/projected/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-kube-api-access-ndrtw\") pod \"dnsmasq-dns-6447ccbd8f-l5g9t\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:16 crc kubenswrapper[4909]: I1201 10:52:16.028773 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-l5g9t\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:16 crc kubenswrapper[4909]: I1201 10:52:16.028796 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-l5g9t\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:16 crc kubenswrapper[4909]: I1201 10:52:16.130829 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndrtw\" (UniqueName: \"kubernetes.io/projected/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-kube-api-access-ndrtw\") pod \"dnsmasq-dns-6447ccbd8f-l5g9t\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:16 crc kubenswrapper[4909]: I1201 10:52:16.130986 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-l5g9t\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:16 crc 
kubenswrapper[4909]: I1201 10:52:16.131037 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-l5g9t\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:16 crc kubenswrapper[4909]: I1201 10:52:16.132203 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-l5g9t\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:16 crc kubenswrapper[4909]: I1201 10:52:16.132669 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-l5g9t\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:16 crc kubenswrapper[4909]: I1201 10:52:16.133109 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-l5g9t\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:16 crc kubenswrapper[4909]: I1201 10:52:16.134329 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-l5g9t\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:16 crc kubenswrapper[4909]: I1201 10:52:16.135787 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-config\") pod \"dnsmasq-dns-6447ccbd8f-l5g9t\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:16 crc kubenswrapper[4909]: I1201 10:52:16.137089 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-config\") pod \"dnsmasq-dns-6447ccbd8f-l5g9t\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:16 crc kubenswrapper[4909]: I1201 10:52:16.138588 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-l5g9t\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:16 crc kubenswrapper[4909]: I1201 10:52:16.138688 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-l5g9t\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:16 crc kubenswrapper[4909]: I1201 10:52:16.154797 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndrtw\" (UniqueName: \"kubernetes.io/projected/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-kube-api-access-ndrtw\") pod \"dnsmasq-dns-6447ccbd8f-l5g9t\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:16 crc kubenswrapper[4909]: I1201 10:52:16.233422 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:16 crc kubenswrapper[4909]: I1201 10:52:16.721864 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-l5g9t"] Dec 01 10:52:16 crc kubenswrapper[4909]: W1201 10:52:16.740806 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c2adbca_b7df_48fd_82a8_15f8ef7ae458.slice/crio-ec92142d6f2bfc43867bd917f87ac4baa3708a7bd9a11b90e8f8129250e87e56 WatchSource:0}: Error finding container ec92142d6f2bfc43867bd917f87ac4baa3708a7bd9a11b90e8f8129250e87e56: Status 404 returned error can't find the container with id ec92142d6f2bfc43867bd917f87ac4baa3708a7bd9a11b90e8f8129250e87e56 Dec 01 10:52:17 crc kubenswrapper[4909]: I1201 10:52:17.025888 4909 generic.go:334] "Generic (PLEG): container finished" podID="1c2adbca-b7df-48fd-82a8-15f8ef7ae458" containerID="076ca75e29baa46b8a0a8042e214f8451420dd2ac827711a8869c4882e802ed7" exitCode=0 Dec 01 10:52:17 crc kubenswrapper[4909]: I1201 10:52:17.025937 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" event={"ID":"1c2adbca-b7df-48fd-82a8-15f8ef7ae458","Type":"ContainerDied","Data":"076ca75e29baa46b8a0a8042e214f8451420dd2ac827711a8869c4882e802ed7"} Dec 01 10:52:17 crc kubenswrapper[4909]: I1201 10:52:17.027019 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" event={"ID":"1c2adbca-b7df-48fd-82a8-15f8ef7ae458","Type":"ContainerStarted","Data":"ec92142d6f2bfc43867bd917f87ac4baa3708a7bd9a11b90e8f8129250e87e56"} Dec 01 10:52:18 crc kubenswrapper[4909]: I1201 10:52:18.038070 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" event={"ID":"1c2adbca-b7df-48fd-82a8-15f8ef7ae458","Type":"ContainerStarted","Data":"3e12dd37abeaf68bf3073d5e1d7021c861f47c7ff85b24cbfe084739b5524a82"} Dec 01 10:52:18 crc 
kubenswrapper[4909]: I1201 10:52:18.038654 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:18 crc kubenswrapper[4909]: I1201 10:52:18.062686 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" podStartSLOduration=3.062670002 podStartE2EDuration="3.062670002s" podCreationTimestamp="2025-12-01 10:52:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:52:18.060568555 +0000 UTC m=+1255.295039473" watchObservedRunningTime="2025-12-01 10:52:18.062670002 +0000 UTC m=+1255.297140900" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.236105 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.293754 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-94dvb"] Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.293989 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b856c5697-94dvb" podUID="63a9d1b0-6050-4bf6-b247-aea03752927e" containerName="dnsmasq-dns" containerID="cri-o://507523ee001f6a4a4511910a78049b4a41e08c65abb6ff2119c45f62ed7230c2" gracePeriod=10 Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.444337 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-4f592"] Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.455317 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.469493 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-4f592"] Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.537710 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/68b41abf-32f2-4e47-b5ac-f1e689edda28-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-4f592\" (UID: \"68b41abf-32f2-4e47-b5ac-f1e689edda28\") " pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.537814 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68b41abf-32f2-4e47-b5ac-f1e689edda28-config\") pod \"dnsmasq-dns-864d5fc68c-4f592\" (UID: \"68b41abf-32f2-4e47-b5ac-f1e689edda28\") " pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.537847 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68b41abf-32f2-4e47-b5ac-f1e689edda28-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-4f592\" (UID: \"68b41abf-32f2-4e47-b5ac-f1e689edda28\") " pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.537869 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tllfl\" (UniqueName: \"kubernetes.io/projected/68b41abf-32f2-4e47-b5ac-f1e689edda28-kube-api-access-tllfl\") pod \"dnsmasq-dns-864d5fc68c-4f592\" (UID: \"68b41abf-32f2-4e47-b5ac-f1e689edda28\") " pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.537996 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68b41abf-32f2-4e47-b5ac-f1e689edda28-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-4f592\" (UID: \"68b41abf-32f2-4e47-b5ac-f1e689edda28\") " pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.538018 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68b41abf-32f2-4e47-b5ac-f1e689edda28-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-4f592\" (UID: \"68b41abf-32f2-4e47-b5ac-f1e689edda28\") " pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.639590 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68b41abf-32f2-4e47-b5ac-f1e689edda28-config\") pod \"dnsmasq-dns-864d5fc68c-4f592\" (UID: \"68b41abf-32f2-4e47-b5ac-f1e689edda28\") " pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.639661 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68b41abf-32f2-4e47-b5ac-f1e689edda28-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-4f592\" (UID: \"68b41abf-32f2-4e47-b5ac-f1e689edda28\") " pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.639694 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tllfl\" (UniqueName: \"kubernetes.io/projected/68b41abf-32f2-4e47-b5ac-f1e689edda28-kube-api-access-tllfl\") pod \"dnsmasq-dns-864d5fc68c-4f592\" (UID: \"68b41abf-32f2-4e47-b5ac-f1e689edda28\") " pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.639747 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68b41abf-32f2-4e47-b5ac-f1e689edda28-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-4f592\" (UID: \"68b41abf-32f2-4e47-b5ac-f1e689edda28\") " pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.639770 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68b41abf-32f2-4e47-b5ac-f1e689edda28-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-4f592\" (UID: \"68b41abf-32f2-4e47-b5ac-f1e689edda28\") " pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.639897 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/68b41abf-32f2-4e47-b5ac-f1e689edda28-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-4f592\" (UID: \"68b41abf-32f2-4e47-b5ac-f1e689edda28\") " pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.640629 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68b41abf-32f2-4e47-b5ac-f1e689edda28-config\") pod \"dnsmasq-dns-864d5fc68c-4f592\" (UID: \"68b41abf-32f2-4e47-b5ac-f1e689edda28\") " pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.641355 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68b41abf-32f2-4e47-b5ac-f1e689edda28-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-4f592\" (UID: \"68b41abf-32f2-4e47-b5ac-f1e689edda28\") " pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.642174 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/68b41abf-32f2-4e47-b5ac-f1e689edda28-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-4f592\" (UID: \"68b41abf-32f2-4e47-b5ac-f1e689edda28\") " pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.642382 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68b41abf-32f2-4e47-b5ac-f1e689edda28-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-4f592\" (UID: \"68b41abf-32f2-4e47-b5ac-f1e689edda28\") " pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.645427 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/68b41abf-32f2-4e47-b5ac-f1e689edda28-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-4f592\" (UID: \"68b41abf-32f2-4e47-b5ac-f1e689edda28\") " pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.682402 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tllfl\" (UniqueName: \"kubernetes.io/projected/68b41abf-32f2-4e47-b5ac-f1e689edda28-kube-api-access-tllfl\") pod \"dnsmasq-dns-864d5fc68c-4f592\" (UID: \"68b41abf-32f2-4e47-b5ac-f1e689edda28\") " pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.788207 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.929385 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-94dvb" Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.943765 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-config\") pod \"63a9d1b0-6050-4bf6-b247-aea03752927e\" (UID: \"63a9d1b0-6050-4bf6-b247-aea03752927e\") " Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.943816 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-ovsdbserver-nb\") pod \"63a9d1b0-6050-4bf6-b247-aea03752927e\" (UID: \"63a9d1b0-6050-4bf6-b247-aea03752927e\") " Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.944101 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29cl8\" (UniqueName: \"kubernetes.io/projected/63a9d1b0-6050-4bf6-b247-aea03752927e-kube-api-access-29cl8\") pod \"63a9d1b0-6050-4bf6-b247-aea03752927e\" (UID: \"63a9d1b0-6050-4bf6-b247-aea03752927e\") " Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.944351 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-ovsdbserver-sb\") pod \"63a9d1b0-6050-4bf6-b247-aea03752927e\" (UID: \"63a9d1b0-6050-4bf6-b247-aea03752927e\") " Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.944435 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-dns-svc\") pod \"63a9d1b0-6050-4bf6-b247-aea03752927e\" (UID: \"63a9d1b0-6050-4bf6-b247-aea03752927e\") " Dec 01 10:52:26 crc kubenswrapper[4909]: I1201 10:52:26.956060 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/63a9d1b0-6050-4bf6-b247-aea03752927e-kube-api-access-29cl8" (OuterVolumeSpecName: "kube-api-access-29cl8") pod "63a9d1b0-6050-4bf6-b247-aea03752927e" (UID: "63a9d1b0-6050-4bf6-b247-aea03752927e"). InnerVolumeSpecName "kube-api-access-29cl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.010226 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "63a9d1b0-6050-4bf6-b247-aea03752927e" (UID: "63a9d1b0-6050-4bf6-b247-aea03752927e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.022830 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "63a9d1b0-6050-4bf6-b247-aea03752927e" (UID: "63a9d1b0-6050-4bf6-b247-aea03752927e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.028476 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "63a9d1b0-6050-4bf6-b247-aea03752927e" (UID: "63a9d1b0-6050-4bf6-b247-aea03752927e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.035309 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-config" (OuterVolumeSpecName: "config") pod "63a9d1b0-6050-4bf6-b247-aea03752927e" (UID: "63a9d1b0-6050-4bf6-b247-aea03752927e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.046769 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.046791 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.046802 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.046811 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63a9d1b0-6050-4bf6-b247-aea03752927e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.046822 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29cl8\" (UniqueName: \"kubernetes.io/projected/63a9d1b0-6050-4bf6-b247-aea03752927e-kube-api-access-29cl8\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.113004 4909 generic.go:334] "Generic (PLEG): container finished" podID="63a9d1b0-6050-4bf6-b247-aea03752927e" containerID="507523ee001f6a4a4511910a78049b4a41e08c65abb6ff2119c45f62ed7230c2" exitCode=0 Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.113047 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-94dvb" event={"ID":"63a9d1b0-6050-4bf6-b247-aea03752927e","Type":"ContainerDied","Data":"507523ee001f6a4a4511910a78049b4a41e08c65abb6ff2119c45f62ed7230c2"} Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 
10:52:27.113100 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-94dvb" event={"ID":"63a9d1b0-6050-4bf6-b247-aea03752927e","Type":"ContainerDied","Data":"b4609f705960e4c94f463f1e736d2d5c99644409d49011bccd0362d16789e236"} Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.113122 4909 scope.go:117] "RemoveContainer" containerID="507523ee001f6a4a4511910a78049b4a41e08c65abb6ff2119c45f62ed7230c2" Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.113059 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-94dvb" Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.136010 4909 scope.go:117] "RemoveContainer" containerID="32688dbaea43a969a5ad4773ee5a21a91bf5cf07009ca0e97c9fe19bfcbc9f99" Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.153090 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-94dvb"] Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.179105 4909 scope.go:117] "RemoveContainer" containerID="507523ee001f6a4a4511910a78049b4a41e08c65abb6ff2119c45f62ed7230c2" Dec 01 10:52:27 crc kubenswrapper[4909]: E1201 10:52:27.179702 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"507523ee001f6a4a4511910a78049b4a41e08c65abb6ff2119c45f62ed7230c2\": container with ID starting with 507523ee001f6a4a4511910a78049b4a41e08c65abb6ff2119c45f62ed7230c2 not found: ID does not exist" containerID="507523ee001f6a4a4511910a78049b4a41e08c65abb6ff2119c45f62ed7230c2" Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.179779 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"507523ee001f6a4a4511910a78049b4a41e08c65abb6ff2119c45f62ed7230c2"} err="failed to get container status \"507523ee001f6a4a4511910a78049b4a41e08c65abb6ff2119c45f62ed7230c2\": rpc error: code = NotFound desc = could 
not find container \"507523ee001f6a4a4511910a78049b4a41e08c65abb6ff2119c45f62ed7230c2\": container with ID starting with 507523ee001f6a4a4511910a78049b4a41e08c65abb6ff2119c45f62ed7230c2 not found: ID does not exist" Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.179820 4909 scope.go:117] "RemoveContainer" containerID="32688dbaea43a969a5ad4773ee5a21a91bf5cf07009ca0e97c9fe19bfcbc9f99" Dec 01 10:52:27 crc kubenswrapper[4909]: E1201 10:52:27.180149 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32688dbaea43a969a5ad4773ee5a21a91bf5cf07009ca0e97c9fe19bfcbc9f99\": container with ID starting with 32688dbaea43a969a5ad4773ee5a21a91bf5cf07009ca0e97c9fe19bfcbc9f99 not found: ID does not exist" containerID="32688dbaea43a969a5ad4773ee5a21a91bf5cf07009ca0e97c9fe19bfcbc9f99" Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.180179 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32688dbaea43a969a5ad4773ee5a21a91bf5cf07009ca0e97c9fe19bfcbc9f99"} err="failed to get container status \"32688dbaea43a969a5ad4773ee5a21a91bf5cf07009ca0e97c9fe19bfcbc9f99\": rpc error: code = NotFound desc = could not find container \"32688dbaea43a969a5ad4773ee5a21a91bf5cf07009ca0e97c9fe19bfcbc9f99\": container with ID starting with 32688dbaea43a969a5ad4773ee5a21a91bf5cf07009ca0e97c9fe19bfcbc9f99 not found: ID does not exist" Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.186853 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-94dvb"] Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.271711 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63a9d1b0-6050-4bf6-b247-aea03752927e" path="/var/lib/kubelet/pods/63a9d1b0-6050-4bf6-b247-aea03752927e/volumes" Dec 01 10:52:27 crc kubenswrapper[4909]: I1201 10:52:27.273252 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-864d5fc68c-4f592"] Dec 01 10:52:27 crc kubenswrapper[4909]: W1201 10:52:27.275100 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68b41abf_32f2_4e47_b5ac_f1e689edda28.slice/crio-3d34c6bf0b1dae910538ec1c6e4e56aa0aea423384e7431aaaaf0e5324fd0c5f WatchSource:0}: Error finding container 3d34c6bf0b1dae910538ec1c6e4e56aa0aea423384e7431aaaaf0e5324fd0c5f: Status 404 returned error can't find the container with id 3d34c6bf0b1dae910538ec1c6e4e56aa0aea423384e7431aaaaf0e5324fd0c5f Dec 01 10:52:28 crc kubenswrapper[4909]: I1201 10:52:28.126240 4909 generic.go:334] "Generic (PLEG): container finished" podID="68b41abf-32f2-4e47-b5ac-f1e689edda28" containerID="854980238a46894c74499b7ef0dea19c6204b0acade775fc31bb654c86d239b2" exitCode=0 Dec 01 10:52:28 crc kubenswrapper[4909]: I1201 10:52:28.126321 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-4f592" event={"ID":"68b41abf-32f2-4e47-b5ac-f1e689edda28","Type":"ContainerDied","Data":"854980238a46894c74499b7ef0dea19c6204b0acade775fc31bb654c86d239b2"} Dec 01 10:52:28 crc kubenswrapper[4909]: I1201 10:52:28.126690 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-4f592" event={"ID":"68b41abf-32f2-4e47-b5ac-f1e689edda28","Type":"ContainerStarted","Data":"3d34c6bf0b1dae910538ec1c6e4e56aa0aea423384e7431aaaaf0e5324fd0c5f"} Dec 01 10:52:29 crc kubenswrapper[4909]: I1201 10:52:29.137065 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-4f592" event={"ID":"68b41abf-32f2-4e47-b5ac-f1e689edda28","Type":"ContainerStarted","Data":"f03845d6f3dfd49f2d7251cc1ec986be1e935ecce219b9c68cb5ea978c0fb6d3"} Dec 01 10:52:29 crc kubenswrapper[4909]: I1201 10:52:29.137517 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:29 crc 
kubenswrapper[4909]: I1201 10:52:29.159425 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-864d5fc68c-4f592" podStartSLOduration=3.159405176 podStartE2EDuration="3.159405176s" podCreationTimestamp="2025-12-01 10:52:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:52:29.15205895 +0000 UTC m=+1266.386529858" watchObservedRunningTime="2025-12-01 10:52:29.159405176 +0000 UTC m=+1266.393876074" Dec 01 10:52:31 crc kubenswrapper[4909]: I1201 10:52:31.764735 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b856c5697-94dvb" podUID="63a9d1b0-6050-4bf6-b247-aea03752927e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.180:5353: i/o timeout" Dec 01 10:52:36 crc kubenswrapper[4909]: I1201 10:52:36.193662 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:52:36 crc kubenswrapper[4909]: I1201 10:52:36.194601 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:52:36 crc kubenswrapper[4909]: I1201 10:52:36.790080 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-864d5fc68c-4f592" Dec 01 10:52:36 crc kubenswrapper[4909]: I1201 10:52:36.855272 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-l5g9t"] Dec 01 10:52:36 crc kubenswrapper[4909]: I1201 
10:52:36.855517 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" podUID="1c2adbca-b7df-48fd-82a8-15f8ef7ae458" containerName="dnsmasq-dns" containerID="cri-o://3e12dd37abeaf68bf3073d5e1d7021c861f47c7ff85b24cbfe084739b5524a82" gracePeriod=10 Dec 01 10:52:37 crc kubenswrapper[4909]: I1201 10:52:37.226468 4909 generic.go:334] "Generic (PLEG): container finished" podID="1c2adbca-b7df-48fd-82a8-15f8ef7ae458" containerID="3e12dd37abeaf68bf3073d5e1d7021c861f47c7ff85b24cbfe084739b5524a82" exitCode=0 Dec 01 10:52:37 crc kubenswrapper[4909]: I1201 10:52:37.226579 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" event={"ID":"1c2adbca-b7df-48fd-82a8-15f8ef7ae458","Type":"ContainerDied","Data":"3e12dd37abeaf68bf3073d5e1d7021c861f47c7ff85b24cbfe084739b5524a82"} Dec 01 10:52:37 crc kubenswrapper[4909]: I1201 10:52:37.430689 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:37 crc kubenswrapper[4909]: I1201 10:52:37.556423 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-ovsdbserver-sb\") pod \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " Dec 01 10:52:37 crc kubenswrapper[4909]: I1201 10:52:37.556907 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-config\") pod \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " Dec 01 10:52:37 crc kubenswrapper[4909]: I1201 10:52:37.556999 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndrtw\" (UniqueName: 
\"kubernetes.io/projected/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-kube-api-access-ndrtw\") pod \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " Dec 01 10:52:37 crc kubenswrapper[4909]: I1201 10:52:37.557034 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-openstack-edpm-ipam\") pod \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " Dec 01 10:52:37 crc kubenswrapper[4909]: I1201 10:52:37.557120 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-dns-svc\") pod \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " Dec 01 10:52:37 crc kubenswrapper[4909]: I1201 10:52:37.557152 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-ovsdbserver-nb\") pod \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\" (UID: \"1c2adbca-b7df-48fd-82a8-15f8ef7ae458\") " Dec 01 10:52:37 crc kubenswrapper[4909]: I1201 10:52:37.564283 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-kube-api-access-ndrtw" (OuterVolumeSpecName: "kube-api-access-ndrtw") pod "1c2adbca-b7df-48fd-82a8-15f8ef7ae458" (UID: "1c2adbca-b7df-48fd-82a8-15f8ef7ae458"). InnerVolumeSpecName "kube-api-access-ndrtw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:52:37 crc kubenswrapper[4909]: I1201 10:52:37.609097 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-config" (OuterVolumeSpecName: "config") pod "1c2adbca-b7df-48fd-82a8-15f8ef7ae458" (UID: "1c2adbca-b7df-48fd-82a8-15f8ef7ae458"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:52:37 crc kubenswrapper[4909]: I1201 10:52:37.612346 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1c2adbca-b7df-48fd-82a8-15f8ef7ae458" (UID: "1c2adbca-b7df-48fd-82a8-15f8ef7ae458"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:52:37 crc kubenswrapper[4909]: I1201 10:52:37.621141 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1c2adbca-b7df-48fd-82a8-15f8ef7ae458" (UID: "1c2adbca-b7df-48fd-82a8-15f8ef7ae458"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:52:37 crc kubenswrapper[4909]: I1201 10:52:37.621594 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "1c2adbca-b7df-48fd-82a8-15f8ef7ae458" (UID: "1c2adbca-b7df-48fd-82a8-15f8ef7ae458"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:52:37 crc kubenswrapper[4909]: I1201 10:52:37.622895 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1c2adbca-b7df-48fd-82a8-15f8ef7ae458" (UID: "1c2adbca-b7df-48fd-82a8-15f8ef7ae458"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:52:37 crc kubenswrapper[4909]: I1201 10:52:37.659118 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:37 crc kubenswrapper[4909]: I1201 10:52:37.659151 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:37 crc kubenswrapper[4909]: I1201 10:52:37.659165 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndrtw\" (UniqueName: \"kubernetes.io/projected/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-kube-api-access-ndrtw\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:37 crc kubenswrapper[4909]: I1201 10:52:37.659177 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:37 crc kubenswrapper[4909]: I1201 10:52:37.659185 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:37 crc kubenswrapper[4909]: I1201 10:52:37.659193 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1c2adbca-b7df-48fd-82a8-15f8ef7ae458-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:52:38 crc kubenswrapper[4909]: I1201 10:52:38.239019 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" event={"ID":"1c2adbca-b7df-48fd-82a8-15f8ef7ae458","Type":"ContainerDied","Data":"ec92142d6f2bfc43867bd917f87ac4baa3708a7bd9a11b90e8f8129250e87e56"} Dec 01 10:52:38 crc kubenswrapper[4909]: I1201 10:52:38.239095 4909 scope.go:117] "RemoveContainer" containerID="3e12dd37abeaf68bf3073d5e1d7021c861f47c7ff85b24cbfe084739b5524a82" Dec 01 10:52:38 crc kubenswrapper[4909]: I1201 10:52:38.239271 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-l5g9t" Dec 01 10:52:38 crc kubenswrapper[4909]: I1201 10:52:38.276763 4909 scope.go:117] "RemoveContainer" containerID="076ca75e29baa46b8a0a8042e214f8451420dd2ac827711a8869c4882e802ed7" Dec 01 10:52:38 crc kubenswrapper[4909]: I1201 10:52:38.285113 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-l5g9t"] Dec 01 10:52:38 crc kubenswrapper[4909]: I1201 10:52:38.298261 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-l5g9t"] Dec 01 10:52:39 crc kubenswrapper[4909]: I1201 10:52:39.277615 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c2adbca-b7df-48fd-82a8-15f8ef7ae458" path="/var/lib/kubelet/pods/1c2adbca-b7df-48fd-82a8-15f8ef7ae458/volumes" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.528045 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd"] Dec 01 10:52:42 crc kubenswrapper[4909]: E1201 10:52:42.528915 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63a9d1b0-6050-4bf6-b247-aea03752927e" containerName="init" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.528927 4909 
state_mem.go:107] "Deleted CPUSet assignment" podUID="63a9d1b0-6050-4bf6-b247-aea03752927e" containerName="init" Dec 01 10:52:42 crc kubenswrapper[4909]: E1201 10:52:42.528948 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c2adbca-b7df-48fd-82a8-15f8ef7ae458" containerName="dnsmasq-dns" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.528956 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c2adbca-b7df-48fd-82a8-15f8ef7ae458" containerName="dnsmasq-dns" Dec 01 10:52:42 crc kubenswrapper[4909]: E1201 10:52:42.528970 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c2adbca-b7df-48fd-82a8-15f8ef7ae458" containerName="init" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.528979 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c2adbca-b7df-48fd-82a8-15f8ef7ae458" containerName="init" Dec 01 10:52:42 crc kubenswrapper[4909]: E1201 10:52:42.528997 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63a9d1b0-6050-4bf6-b247-aea03752927e" containerName="dnsmasq-dns" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.529006 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="63a9d1b0-6050-4bf6-b247-aea03752927e" containerName="dnsmasq-dns" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.529223 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c2adbca-b7df-48fd-82a8-15f8ef7ae458" containerName="dnsmasq-dns" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.529249 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="63a9d1b0-6050-4bf6-b247-aea03752927e" containerName="dnsmasq-dns" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.530017 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.539950 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.540183 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.541643 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.542056 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.555384 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd"] Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.665207 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-474nn\" (UniqueName: \"kubernetes.io/projected/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-kube-api-access-474nn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd\" (UID: \"07c812b5-85e1-4e57-bbf8-139a0a4e71d0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.665270 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd\" (UID: \"07c812b5-85e1-4e57-bbf8-139a0a4e71d0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd" Dec 01 10:52:42 crc kubenswrapper[4909]: 
I1201 10:52:42.665345 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd\" (UID: \"07c812b5-85e1-4e57-bbf8-139a0a4e71d0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.665384 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd\" (UID: \"07c812b5-85e1-4e57-bbf8-139a0a4e71d0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.767179 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd\" (UID: \"07c812b5-85e1-4e57-bbf8-139a0a4e71d0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.767235 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd\" (UID: \"07c812b5-85e1-4e57-bbf8-139a0a4e71d0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.767310 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-474nn\" (UniqueName: \"kubernetes.io/projected/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-kube-api-access-474nn\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd\" (UID: \"07c812b5-85e1-4e57-bbf8-139a0a4e71d0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.767344 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd\" (UID: \"07c812b5-85e1-4e57-bbf8-139a0a4e71d0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.773944 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd\" (UID: \"07c812b5-85e1-4e57-bbf8-139a0a4e71d0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.774780 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd\" (UID: \"07c812b5-85e1-4e57-bbf8-139a0a4e71d0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.778374 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd\" (UID: \"07c812b5-85e1-4e57-bbf8-139a0a4e71d0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.786023 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-474nn\" (UniqueName: \"kubernetes.io/projected/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-kube-api-access-474nn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd\" (UID: \"07c812b5-85e1-4e57-bbf8-139a0a4e71d0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd" Dec 01 10:52:42 crc kubenswrapper[4909]: I1201 10:52:42.851698 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd" Dec 01 10:52:43 crc kubenswrapper[4909]: I1201 10:52:43.395310 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd"] Dec 01 10:52:43 crc kubenswrapper[4909]: W1201 10:52:43.411645 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07c812b5_85e1_4e57_bbf8_139a0a4e71d0.slice/crio-96d31d83ce703aaacd9b3ba688241fd614c864f3fdd8656929d0baa0391ea79e WatchSource:0}: Error finding container 96d31d83ce703aaacd9b3ba688241fd614c864f3fdd8656929d0baa0391ea79e: Status 404 returned error can't find the container with id 96d31d83ce703aaacd9b3ba688241fd614c864f3fdd8656929d0baa0391ea79e Dec 01 10:52:43 crc kubenswrapper[4909]: I1201 10:52:43.415227 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:52:44 crc kubenswrapper[4909]: I1201 10:52:44.318081 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd" event={"ID":"07c812b5-85e1-4e57-bbf8-139a0a4e71d0","Type":"ContainerStarted","Data":"96d31d83ce703aaacd9b3ba688241fd614c864f3fdd8656929d0baa0391ea79e"} Dec 01 10:52:47 crc kubenswrapper[4909]: I1201 10:52:47.354751 4909 generic.go:334] "Generic (PLEG): container finished" podID="ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b" 
containerID="d888684a9f294c58fc65acc3cf78ac33a1f75d2b89f618d5d4a1336a08c15238" exitCode=0 Dec 01 10:52:47 crc kubenswrapper[4909]: I1201 10:52:47.354859 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b","Type":"ContainerDied","Data":"d888684a9f294c58fc65acc3cf78ac33a1f75d2b89f618d5d4a1336a08c15238"} Dec 01 10:52:48 crc kubenswrapper[4909]: I1201 10:52:48.367835 4909 generic.go:334] "Generic (PLEG): container finished" podID="46315eac-b29e-48fa-864d-f105eefd2fc3" containerID="14a87b2789a45a8edbc95211bd5c17b090ce5f005fb5bcc43041f9315ea476f7" exitCode=0 Dec 01 10:52:48 crc kubenswrapper[4909]: I1201 10:52:48.367904 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"46315eac-b29e-48fa-864d-f105eefd2fc3","Type":"ContainerDied","Data":"14a87b2789a45a8edbc95211bd5c17b090ce5f005fb5bcc43041f9315ea476f7"} Dec 01 10:52:52 crc kubenswrapper[4909]: I1201 10:52:52.436132 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b","Type":"ContainerStarted","Data":"464180f2216b93f17abeb3fba5991d881904e2d048bae317d0c4e45438d169b1"} Dec 01 10:52:52 crc kubenswrapper[4909]: I1201 10:52:52.437215 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 01 10:52:52 crc kubenswrapper[4909]: I1201 10:52:52.438681 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"46315eac-b29e-48fa-864d-f105eefd2fc3","Type":"ContainerStarted","Data":"6b0c55ff26284b0a0af890fb3d0cbab811242708e4f4047404eead760e530659"} Dec 01 10:52:52 crc kubenswrapper[4909]: I1201 10:52:52.439410 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:52:52 crc kubenswrapper[4909]: I1201 10:52:52.441950 4909 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd" event={"ID":"07c812b5-85e1-4e57-bbf8-139a0a4e71d0","Type":"ContainerStarted","Data":"e453b33804409c9bbb58cfcd51f05c03dee6ee5e4aca100872a7861129b4b162"} Dec 01 10:52:52 crc kubenswrapper[4909]: I1201 10:52:52.471183 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.471161686 podStartE2EDuration="40.471161686s" podCreationTimestamp="2025-12-01 10:52:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:52:52.466375302 +0000 UTC m=+1289.700846220" watchObservedRunningTime="2025-12-01 10:52:52.471161686 +0000 UTC m=+1289.705632594" Dec 01 10:52:52 crc kubenswrapper[4909]: I1201 10:52:52.539698 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.539677779 podStartE2EDuration="39.539677779s" podCreationTimestamp="2025-12-01 10:52:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:52:52.501409159 +0000 UTC m=+1289.735880077" watchObservedRunningTime="2025-12-01 10:52:52.539677779 +0000 UTC m=+1289.774148677" Dec 01 10:52:52 crc kubenswrapper[4909]: I1201 10:52:52.544952 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd" podStartSLOduration=1.9559311209999999 podStartE2EDuration="10.544943018s" podCreationTimestamp="2025-12-01 10:52:42 +0000 UTC" firstStartedPulling="2025-12-01 10:52:43.414988006 +0000 UTC m=+1280.649458904" lastFinishedPulling="2025-12-01 10:52:52.003999903 +0000 UTC m=+1289.238470801" observedRunningTime="2025-12-01 10:52:52.524861773 +0000 UTC m=+1289.759332671" watchObservedRunningTime="2025-12-01 
10:52:52.544943018 +0000 UTC m=+1289.779413916" Dec 01 10:53:02 crc kubenswrapper[4909]: I1201 10:53:02.451481 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 01 10:53:03 crc kubenswrapper[4909]: I1201 10:53:03.416153 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:53:04 crc kubenswrapper[4909]: I1201 10:53:04.555015 4909 generic.go:334] "Generic (PLEG): container finished" podID="07c812b5-85e1-4e57-bbf8-139a0a4e71d0" containerID="e453b33804409c9bbb58cfcd51f05c03dee6ee5e4aca100872a7861129b4b162" exitCode=0 Dec 01 10:53:04 crc kubenswrapper[4909]: I1201 10:53:04.555101 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd" event={"ID":"07c812b5-85e1-4e57-bbf8-139a0a4e71d0","Type":"ContainerDied","Data":"e453b33804409c9bbb58cfcd51f05c03dee6ee5e4aca100872a7861129b4b162"} Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.008253 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.099035 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-ssh-key\") pod \"07c812b5-85e1-4e57-bbf8-139a0a4e71d0\" (UID: \"07c812b5-85e1-4e57-bbf8-139a0a4e71d0\") " Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.099117 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-474nn\" (UniqueName: \"kubernetes.io/projected/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-kube-api-access-474nn\") pod \"07c812b5-85e1-4e57-bbf8-139a0a4e71d0\" (UID: \"07c812b5-85e1-4e57-bbf8-139a0a4e71d0\") " Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.099148 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-repo-setup-combined-ca-bundle\") pod \"07c812b5-85e1-4e57-bbf8-139a0a4e71d0\" (UID: \"07c812b5-85e1-4e57-bbf8-139a0a4e71d0\") " Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.099283 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-inventory\") pod \"07c812b5-85e1-4e57-bbf8-139a0a4e71d0\" (UID: \"07c812b5-85e1-4e57-bbf8-139a0a4e71d0\") " Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.105198 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "07c812b5-85e1-4e57-bbf8-139a0a4e71d0" (UID: "07c812b5-85e1-4e57-bbf8-139a0a4e71d0"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.105980 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-kube-api-access-474nn" (OuterVolumeSpecName: "kube-api-access-474nn") pod "07c812b5-85e1-4e57-bbf8-139a0a4e71d0" (UID: "07c812b5-85e1-4e57-bbf8-139a0a4e71d0"). InnerVolumeSpecName "kube-api-access-474nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.126961 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "07c812b5-85e1-4e57-bbf8-139a0a4e71d0" (UID: "07c812b5-85e1-4e57-bbf8-139a0a4e71d0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.128154 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-inventory" (OuterVolumeSpecName: "inventory") pod "07c812b5-85e1-4e57-bbf8-139a0a4e71d0" (UID: "07c812b5-85e1-4e57-bbf8-139a0a4e71d0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.194184 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.194557 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.202275 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.202319 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-474nn\" (UniqueName: \"kubernetes.io/projected/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-kube-api-access-474nn\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.202338 4909 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.202352 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07c812b5-85e1-4e57-bbf8-139a0a4e71d0-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.574714 4909 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd" event={"ID":"07c812b5-85e1-4e57-bbf8-139a0a4e71d0","Type":"ContainerDied","Data":"96d31d83ce703aaacd9b3ba688241fd614c864f3fdd8656929d0baa0391ea79e"} Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.574778 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96d31d83ce703aaacd9b3ba688241fd614c864f3fdd8656929d0baa0391ea79e" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.574779 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.663694 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k"] Dec 01 10:53:06 crc kubenswrapper[4909]: E1201 10:53:06.665456 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07c812b5-85e1-4e57-bbf8-139a0a4e71d0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.665483 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="07c812b5-85e1-4e57-bbf8-139a0a4e71d0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.665689 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="07c812b5-85e1-4e57-bbf8-139a0a4e71d0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.666378 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.677170 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k"] Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.711386 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.711505 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.711386 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.711609 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.814633 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fce05ed6-3691-4184-9a55-9dfac30486cf-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k\" (UID: \"fce05ed6-3691-4184-9a55-9dfac30486cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.814679 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fce05ed6-3691-4184-9a55-9dfac30486cf-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k\" (UID: \"fce05ed6-3691-4184-9a55-9dfac30486cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.814721 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fce05ed6-3691-4184-9a55-9dfac30486cf-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k\" (UID: \"fce05ed6-3691-4184-9a55-9dfac30486cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.814798 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snr9j\" (UniqueName: \"kubernetes.io/projected/fce05ed6-3691-4184-9a55-9dfac30486cf-kube-api-access-snr9j\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k\" (UID: \"fce05ed6-3691-4184-9a55-9dfac30486cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.916783 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snr9j\" (UniqueName: \"kubernetes.io/projected/fce05ed6-3691-4184-9a55-9dfac30486cf-kube-api-access-snr9j\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k\" (UID: \"fce05ed6-3691-4184-9a55-9dfac30486cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.917459 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fce05ed6-3691-4184-9a55-9dfac30486cf-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k\" (UID: \"fce05ed6-3691-4184-9a55-9dfac30486cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.917667 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fce05ed6-3691-4184-9a55-9dfac30486cf-ssh-key\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k\" (UID: \"fce05ed6-3691-4184-9a55-9dfac30486cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.917865 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fce05ed6-3691-4184-9a55-9dfac30486cf-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k\" (UID: \"fce05ed6-3691-4184-9a55-9dfac30486cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.922600 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fce05ed6-3691-4184-9a55-9dfac30486cf-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k\" (UID: \"fce05ed6-3691-4184-9a55-9dfac30486cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.922726 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fce05ed6-3691-4184-9a55-9dfac30486cf-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k\" (UID: \"fce05ed6-3691-4184-9a55-9dfac30486cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.926506 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fce05ed6-3691-4184-9a55-9dfac30486cf-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k\" (UID: \"fce05ed6-3691-4184-9a55-9dfac30486cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k" Dec 01 10:53:06 crc kubenswrapper[4909]: I1201 10:53:06.934504 4909 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-snr9j\" (UniqueName: \"kubernetes.io/projected/fce05ed6-3691-4184-9a55-9dfac30486cf-kube-api-access-snr9j\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k\" (UID: \"fce05ed6-3691-4184-9a55-9dfac30486cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k" Dec 01 10:53:07 crc kubenswrapper[4909]: I1201 10:53:07.036614 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k" Dec 01 10:53:07 crc kubenswrapper[4909]: W1201 10:53:07.544202 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfce05ed6_3691_4184_9a55_9dfac30486cf.slice/crio-655ce4185fee782d23272d06ea8a258b36350d87dcac616907afb6d54bd7ea52 WatchSource:0}: Error finding container 655ce4185fee782d23272d06ea8a258b36350d87dcac616907afb6d54bd7ea52: Status 404 returned error can't find the container with id 655ce4185fee782d23272d06ea8a258b36350d87dcac616907afb6d54bd7ea52 Dec 01 10:53:07 crc kubenswrapper[4909]: I1201 10:53:07.544696 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k"] Dec 01 10:53:07 crc kubenswrapper[4909]: I1201 10:53:07.585591 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k" event={"ID":"fce05ed6-3691-4184-9a55-9dfac30486cf","Type":"ContainerStarted","Data":"655ce4185fee782d23272d06ea8a258b36350d87dcac616907afb6d54bd7ea52"} Dec 01 10:53:08 crc kubenswrapper[4909]: I1201 10:53:08.596791 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k" event={"ID":"fce05ed6-3691-4184-9a55-9dfac30486cf","Type":"ContainerStarted","Data":"c42a594113560ab1346b41acbaddb119f4b00f5b16804eac70747d795260e7a4"} Dec 01 10:53:08 crc kubenswrapper[4909]: I1201 10:53:08.618221 4909 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k" podStartSLOduration=2.111378696 podStartE2EDuration="2.618204s" podCreationTimestamp="2025-12-01 10:53:06 +0000 UTC" firstStartedPulling="2025-12-01 10:53:07.54821856 +0000 UTC m=+1304.782689458" lastFinishedPulling="2025-12-01 10:53:08.055043864 +0000 UTC m=+1305.289514762" observedRunningTime="2025-12-01 10:53:08.617526548 +0000 UTC m=+1305.851997466" watchObservedRunningTime="2025-12-01 10:53:08.618204 +0000 UTC m=+1305.852674908" Dec 01 10:53:36 crc kubenswrapper[4909]: I1201 10:53:36.193444 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:53:36 crc kubenswrapper[4909]: I1201 10:53:36.194062 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:53:36 crc kubenswrapper[4909]: I1201 10:53:36.194108 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:53:36 crc kubenswrapper[4909]: I1201 10:53:36.194868 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6aa807195832fa8b3d5986bee6241afe4be3be05e68f4c945ec2b1d547d17a95"} pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:53:36 crc 
kubenswrapper[4909]: I1201 10:53:36.194941 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" containerID="cri-o://6aa807195832fa8b3d5986bee6241afe4be3be05e68f4c945ec2b1d547d17a95" gracePeriod=600 Dec 01 10:53:36 crc kubenswrapper[4909]: I1201 10:53:36.862830 4909 generic.go:334] "Generic (PLEG): container finished" podID="672850e4-d044-44cc-b8a2-517dc1a285be" containerID="6aa807195832fa8b3d5986bee6241afe4be3be05e68f4c945ec2b1d547d17a95" exitCode=0 Dec 01 10:53:36 crc kubenswrapper[4909]: I1201 10:53:36.862910 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerDied","Data":"6aa807195832fa8b3d5986bee6241afe4be3be05e68f4c945ec2b1d547d17a95"} Dec 01 10:53:36 crc kubenswrapper[4909]: I1201 10:53:36.863279 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb"} Dec 01 10:53:36 crc kubenswrapper[4909]: I1201 10:53:36.863318 4909 scope.go:117] "RemoveContainer" containerID="8100ded86185432844121910234322762069105ef0bb9776e57888f1149baba1" Dec 01 10:53:39 crc kubenswrapper[4909]: I1201 10:53:39.077929 4909 scope.go:117] "RemoveContainer" containerID="1fd7a3d90828bd95639b5ee25e3a089db7f86661ebc36922f2c526dfbe60a8c7" Dec 01 10:53:39 crc kubenswrapper[4909]: I1201 10:53:39.098791 4909 scope.go:117] "RemoveContainer" containerID="beeec8123cfad4adc9e251a33355dd05a7a2192e83ad3f4b7a376ed36474ce5c" Dec 01 10:54:39 crc kubenswrapper[4909]: I1201 10:54:39.182911 4909 scope.go:117] "RemoveContainer" 
containerID="c8df12725e92ea8b98a39c1775644a76f065b526f47c544c9f7ac33438dfadc4" Dec 01 10:54:39 crc kubenswrapper[4909]: I1201 10:54:39.227622 4909 scope.go:117] "RemoveContainer" containerID="ed099a35ee29271cbf59128ca6a1e18c44d89d0d2fc53355c26f026f9c9ddf2f" Dec 01 10:55:36 crc kubenswrapper[4909]: I1201 10:55:36.194081 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:55:36 crc kubenswrapper[4909]: I1201 10:55:36.194609 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:56:06 crc kubenswrapper[4909]: I1201 10:56:06.193824 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:56:06 crc kubenswrapper[4909]: I1201 10:56:06.194589 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:56:13 crc kubenswrapper[4909]: I1201 10:56:13.777492 4909 generic.go:334] "Generic (PLEG): container finished" podID="fce05ed6-3691-4184-9a55-9dfac30486cf" containerID="c42a594113560ab1346b41acbaddb119f4b00f5b16804eac70747d795260e7a4" 
exitCode=0 Dec 01 10:56:13 crc kubenswrapper[4909]: I1201 10:56:13.777572 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k" event={"ID":"fce05ed6-3691-4184-9a55-9dfac30486cf","Type":"ContainerDied","Data":"c42a594113560ab1346b41acbaddb119f4b00f5b16804eac70747d795260e7a4"} Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.223213 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k" Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.353690 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snr9j\" (UniqueName: \"kubernetes.io/projected/fce05ed6-3691-4184-9a55-9dfac30486cf-kube-api-access-snr9j\") pod \"fce05ed6-3691-4184-9a55-9dfac30486cf\" (UID: \"fce05ed6-3691-4184-9a55-9dfac30486cf\") " Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.353953 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fce05ed6-3691-4184-9a55-9dfac30486cf-inventory\") pod \"fce05ed6-3691-4184-9a55-9dfac30486cf\" (UID: \"fce05ed6-3691-4184-9a55-9dfac30486cf\") " Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.353986 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fce05ed6-3691-4184-9a55-9dfac30486cf-ssh-key\") pod \"fce05ed6-3691-4184-9a55-9dfac30486cf\" (UID: \"fce05ed6-3691-4184-9a55-9dfac30486cf\") " Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.354026 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fce05ed6-3691-4184-9a55-9dfac30486cf-bootstrap-combined-ca-bundle\") pod \"fce05ed6-3691-4184-9a55-9dfac30486cf\" (UID: \"fce05ed6-3691-4184-9a55-9dfac30486cf\") " Dec 01 10:56:15 
crc kubenswrapper[4909]: I1201 10:56:15.359805 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fce05ed6-3691-4184-9a55-9dfac30486cf-kube-api-access-snr9j" (OuterVolumeSpecName: "kube-api-access-snr9j") pod "fce05ed6-3691-4184-9a55-9dfac30486cf" (UID: "fce05ed6-3691-4184-9a55-9dfac30486cf"). InnerVolumeSpecName "kube-api-access-snr9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.360765 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fce05ed6-3691-4184-9a55-9dfac30486cf-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "fce05ed6-3691-4184-9a55-9dfac30486cf" (UID: "fce05ed6-3691-4184-9a55-9dfac30486cf"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.382818 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fce05ed6-3691-4184-9a55-9dfac30486cf-inventory" (OuterVolumeSpecName: "inventory") pod "fce05ed6-3691-4184-9a55-9dfac30486cf" (UID: "fce05ed6-3691-4184-9a55-9dfac30486cf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.400358 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fce05ed6-3691-4184-9a55-9dfac30486cf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fce05ed6-3691-4184-9a55-9dfac30486cf" (UID: "fce05ed6-3691-4184-9a55-9dfac30486cf"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.456965 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fce05ed6-3691-4184-9a55-9dfac30486cf-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.457425 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fce05ed6-3691-4184-9a55-9dfac30486cf-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.457447 4909 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fce05ed6-3691-4184-9a55-9dfac30486cf-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.457458 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snr9j\" (UniqueName: \"kubernetes.io/projected/fce05ed6-3691-4184-9a55-9dfac30486cf-kube-api-access-snr9j\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.798112 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k" event={"ID":"fce05ed6-3691-4184-9a55-9dfac30486cf","Type":"ContainerDied","Data":"655ce4185fee782d23272d06ea8a258b36350d87dcac616907afb6d54bd7ea52"} Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.798167 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="655ce4185fee782d23272d06ea8a258b36350d87dcac616907afb6d54bd7ea52" Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.798223 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k" Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.911135 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6"] Dec 01 10:56:15 crc kubenswrapper[4909]: E1201 10:56:15.911552 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fce05ed6-3691-4184-9a55-9dfac30486cf" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.911572 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fce05ed6-3691-4184-9a55-9dfac30486cf" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.911746 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fce05ed6-3691-4184-9a55-9dfac30486cf" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.912407 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6" Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.914829 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.914867 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.915039 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.916195 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.936308 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6"] Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.970714 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eac9f2ce-c178-44e9-918e-cf0eacbcc7b1-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6\" (UID: \"eac9f2ce-c178-44e9-918e-cf0eacbcc7b1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6" Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 10:56:15.970799 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m59d6\" (UniqueName: \"kubernetes.io/projected/eac9f2ce-c178-44e9-918e-cf0eacbcc7b1-kube-api-access-m59d6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6\" (UID: \"eac9f2ce-c178-44e9-918e-cf0eacbcc7b1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6" Dec 01 10:56:15 crc kubenswrapper[4909]: I1201 
10:56:15.970930 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eac9f2ce-c178-44e9-918e-cf0eacbcc7b1-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6\" (UID: \"eac9f2ce-c178-44e9-918e-cf0eacbcc7b1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6" Dec 01 10:56:16 crc kubenswrapper[4909]: I1201 10:56:16.073195 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eac9f2ce-c178-44e9-918e-cf0eacbcc7b1-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6\" (UID: \"eac9f2ce-c178-44e9-918e-cf0eacbcc7b1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6" Dec 01 10:56:16 crc kubenswrapper[4909]: I1201 10:56:16.073301 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eac9f2ce-c178-44e9-918e-cf0eacbcc7b1-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6\" (UID: \"eac9f2ce-c178-44e9-918e-cf0eacbcc7b1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6" Dec 01 10:56:16 crc kubenswrapper[4909]: I1201 10:56:16.073337 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m59d6\" (UniqueName: \"kubernetes.io/projected/eac9f2ce-c178-44e9-918e-cf0eacbcc7b1-kube-api-access-m59d6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6\" (UID: \"eac9f2ce-c178-44e9-918e-cf0eacbcc7b1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6" Dec 01 10:56:16 crc kubenswrapper[4909]: I1201 10:56:16.089986 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eac9f2ce-c178-44e9-918e-cf0eacbcc7b1-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6\" (UID: \"eac9f2ce-c178-44e9-918e-cf0eacbcc7b1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6" Dec 01 10:56:16 crc kubenswrapper[4909]: I1201 10:56:16.090024 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eac9f2ce-c178-44e9-918e-cf0eacbcc7b1-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6\" (UID: \"eac9f2ce-c178-44e9-918e-cf0eacbcc7b1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6" Dec 01 10:56:16 crc kubenswrapper[4909]: I1201 10:56:16.093378 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m59d6\" (UniqueName: \"kubernetes.io/projected/eac9f2ce-c178-44e9-918e-cf0eacbcc7b1-kube-api-access-m59d6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6\" (UID: \"eac9f2ce-c178-44e9-918e-cf0eacbcc7b1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6" Dec 01 10:56:16 crc kubenswrapper[4909]: I1201 10:56:16.234459 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6" Dec 01 10:56:16 crc kubenswrapper[4909]: I1201 10:56:16.760625 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6"] Dec 01 10:56:16 crc kubenswrapper[4909]: I1201 10:56:16.806970 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6" event={"ID":"eac9f2ce-c178-44e9-918e-cf0eacbcc7b1","Type":"ContainerStarted","Data":"134708b58c62e0969c96271fb8a0c6298c05139aeb98fb511267c0258006a749"} Dec 01 10:56:17 crc kubenswrapper[4909]: I1201 10:56:17.819429 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6" event={"ID":"eac9f2ce-c178-44e9-918e-cf0eacbcc7b1","Type":"ContainerStarted","Data":"d16a6b80c9b1c456d78308d24b546609cb93201bbbf726c3b7d1ca05e6f763cc"} Dec 01 10:56:17 crc kubenswrapper[4909]: I1201 10:56:17.845857 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6" podStartSLOduration=2.303687263 podStartE2EDuration="2.845839202s" podCreationTimestamp="2025-12-01 10:56:15 +0000 UTC" firstStartedPulling="2025-12-01 10:56:16.764817476 +0000 UTC m=+1493.999288374" lastFinishedPulling="2025-12-01 10:56:17.306969415 +0000 UTC m=+1494.541440313" observedRunningTime="2025-12-01 10:56:17.834771231 +0000 UTC m=+1495.069242139" watchObservedRunningTime="2025-12-01 10:56:17.845839202 +0000 UTC m=+1495.080310100" Dec 01 10:56:20 crc kubenswrapper[4909]: I1201 10:56:20.206049 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q4w6w"] Dec 01 10:56:20 crc kubenswrapper[4909]: I1201 10:56:20.211629 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q4w6w" Dec 01 10:56:20 crc kubenswrapper[4909]: I1201 10:56:20.220762 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q4w6w"] Dec 01 10:56:20 crc kubenswrapper[4909]: I1201 10:56:20.263173 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98de3136-445f-4f69-b845-74ddb464d265-catalog-content\") pod \"community-operators-q4w6w\" (UID: \"98de3136-445f-4f69-b845-74ddb464d265\") " pod="openshift-marketplace/community-operators-q4w6w" Dec 01 10:56:20 crc kubenswrapper[4909]: I1201 10:56:20.263349 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98de3136-445f-4f69-b845-74ddb464d265-utilities\") pod \"community-operators-q4w6w\" (UID: \"98de3136-445f-4f69-b845-74ddb464d265\") " pod="openshift-marketplace/community-operators-q4w6w" Dec 01 10:56:20 crc kubenswrapper[4909]: I1201 10:56:20.263375 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb2qv\" (UniqueName: \"kubernetes.io/projected/98de3136-445f-4f69-b845-74ddb464d265-kube-api-access-pb2qv\") pod \"community-operators-q4w6w\" (UID: \"98de3136-445f-4f69-b845-74ddb464d265\") " pod="openshift-marketplace/community-operators-q4w6w" Dec 01 10:56:20 crc kubenswrapper[4909]: I1201 10:56:20.364956 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98de3136-445f-4f69-b845-74ddb464d265-utilities\") pod \"community-operators-q4w6w\" (UID: \"98de3136-445f-4f69-b845-74ddb464d265\") " pod="openshift-marketplace/community-operators-q4w6w" Dec 01 10:56:20 crc kubenswrapper[4909]: I1201 10:56:20.365033 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pb2qv\" (UniqueName: \"kubernetes.io/projected/98de3136-445f-4f69-b845-74ddb464d265-kube-api-access-pb2qv\") pod \"community-operators-q4w6w\" (UID: \"98de3136-445f-4f69-b845-74ddb464d265\") " pod="openshift-marketplace/community-operators-q4w6w" Dec 01 10:56:20 crc kubenswrapper[4909]: I1201 10:56:20.365110 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98de3136-445f-4f69-b845-74ddb464d265-catalog-content\") pod \"community-operators-q4w6w\" (UID: \"98de3136-445f-4f69-b845-74ddb464d265\") " pod="openshift-marketplace/community-operators-q4w6w" Dec 01 10:56:20 crc kubenswrapper[4909]: I1201 10:56:20.366116 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98de3136-445f-4f69-b845-74ddb464d265-catalog-content\") pod \"community-operators-q4w6w\" (UID: \"98de3136-445f-4f69-b845-74ddb464d265\") " pod="openshift-marketplace/community-operators-q4w6w" Dec 01 10:56:20 crc kubenswrapper[4909]: I1201 10:56:20.366356 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98de3136-445f-4f69-b845-74ddb464d265-utilities\") pod \"community-operators-q4w6w\" (UID: \"98de3136-445f-4f69-b845-74ddb464d265\") " pod="openshift-marketplace/community-operators-q4w6w" Dec 01 10:56:20 crc kubenswrapper[4909]: I1201 10:56:20.399628 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb2qv\" (UniqueName: \"kubernetes.io/projected/98de3136-445f-4f69-b845-74ddb464d265-kube-api-access-pb2qv\") pod \"community-operators-q4w6w\" (UID: \"98de3136-445f-4f69-b845-74ddb464d265\") " pod="openshift-marketplace/community-operators-q4w6w" Dec 01 10:56:20 crc kubenswrapper[4909]: I1201 10:56:20.539807 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q4w6w" Dec 01 10:56:21 crc kubenswrapper[4909]: I1201 10:56:21.110764 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q4w6w"] Dec 01 10:56:21 crc kubenswrapper[4909]: I1201 10:56:21.861259 4909 generic.go:334] "Generic (PLEG): container finished" podID="98de3136-445f-4f69-b845-74ddb464d265" containerID="8bad278cc3aec7c518360399028cb3dd0178a08365bf02bf9e31ca4f7ffc0f6b" exitCode=0 Dec 01 10:56:21 crc kubenswrapper[4909]: I1201 10:56:21.861468 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q4w6w" event={"ID":"98de3136-445f-4f69-b845-74ddb464d265","Type":"ContainerDied","Data":"8bad278cc3aec7c518360399028cb3dd0178a08365bf02bf9e31ca4f7ffc0f6b"} Dec 01 10:56:21 crc kubenswrapper[4909]: I1201 10:56:21.861550 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q4w6w" event={"ID":"98de3136-445f-4f69-b845-74ddb464d265","Type":"ContainerStarted","Data":"5459f3b57ac922e30fbb3e08fd237cd5f57c73b28ccfaa665f65a38c48fcacab"} Dec 01 10:56:23 crc kubenswrapper[4909]: I1201 10:56:23.880702 4909 generic.go:334] "Generic (PLEG): container finished" podID="98de3136-445f-4f69-b845-74ddb464d265" containerID="10c1beb35cc89e8b500371c10a7d6b4093e1d7709216fec8b26509c8f7ee2b47" exitCode=0 Dec 01 10:56:23 crc kubenswrapper[4909]: I1201 10:56:23.880770 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q4w6w" event={"ID":"98de3136-445f-4f69-b845-74ddb464d265","Type":"ContainerDied","Data":"10c1beb35cc89e8b500371c10a7d6b4093e1d7709216fec8b26509c8f7ee2b47"} Dec 01 10:56:24 crc kubenswrapper[4909]: I1201 10:56:24.891791 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q4w6w" 
event={"ID":"98de3136-445f-4f69-b845-74ddb464d265","Type":"ContainerStarted","Data":"9897059ded36463ebf1207f3386dddaed4afa24fba459bdc3cad4bcee22a5c7a"} Dec 01 10:56:24 crc kubenswrapper[4909]: I1201 10:56:24.912754 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q4w6w" podStartSLOduration=2.499761982 podStartE2EDuration="4.912732923s" podCreationTimestamp="2025-12-01 10:56:20 +0000 UTC" firstStartedPulling="2025-12-01 10:56:21.864819222 +0000 UTC m=+1499.099290120" lastFinishedPulling="2025-12-01 10:56:24.277790173 +0000 UTC m=+1501.512261061" observedRunningTime="2025-12-01 10:56:24.910304598 +0000 UTC m=+1502.144775506" watchObservedRunningTime="2025-12-01 10:56:24.912732923 +0000 UTC m=+1502.147203831" Dec 01 10:56:30 crc kubenswrapper[4909]: I1201 10:56:30.540564 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q4w6w" Dec 01 10:56:30 crc kubenswrapper[4909]: I1201 10:56:30.541269 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q4w6w" Dec 01 10:56:30 crc kubenswrapper[4909]: I1201 10:56:30.602125 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q4w6w" Dec 01 10:56:30 crc kubenswrapper[4909]: I1201 10:56:30.994140 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q4w6w" Dec 01 10:56:31 crc kubenswrapper[4909]: I1201 10:56:31.049121 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q4w6w"] Dec 01 10:56:32 crc kubenswrapper[4909]: I1201 10:56:32.962027 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q4w6w" podUID="98de3136-445f-4f69-b845-74ddb464d265" containerName="registry-server" 
containerID="cri-o://9897059ded36463ebf1207f3386dddaed4afa24fba459bdc3cad4bcee22a5c7a" gracePeriod=2 Dec 01 10:56:33 crc kubenswrapper[4909]: I1201 10:56:33.416739 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q4w6w" Dec 01 10:56:33 crc kubenswrapper[4909]: I1201 10:56:33.519710 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb2qv\" (UniqueName: \"kubernetes.io/projected/98de3136-445f-4f69-b845-74ddb464d265-kube-api-access-pb2qv\") pod \"98de3136-445f-4f69-b845-74ddb464d265\" (UID: \"98de3136-445f-4f69-b845-74ddb464d265\") " Dec 01 10:56:33 crc kubenswrapper[4909]: I1201 10:56:33.520163 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98de3136-445f-4f69-b845-74ddb464d265-catalog-content\") pod \"98de3136-445f-4f69-b845-74ddb464d265\" (UID: \"98de3136-445f-4f69-b845-74ddb464d265\") " Dec 01 10:56:33 crc kubenswrapper[4909]: I1201 10:56:33.520368 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98de3136-445f-4f69-b845-74ddb464d265-utilities\") pod \"98de3136-445f-4f69-b845-74ddb464d265\" (UID: \"98de3136-445f-4f69-b845-74ddb464d265\") " Dec 01 10:56:33 crc kubenswrapper[4909]: I1201 10:56:33.521129 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98de3136-445f-4f69-b845-74ddb464d265-utilities" (OuterVolumeSpecName: "utilities") pod "98de3136-445f-4f69-b845-74ddb464d265" (UID: "98de3136-445f-4f69-b845-74ddb464d265"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:56:33 crc kubenswrapper[4909]: I1201 10:56:33.528047 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98de3136-445f-4f69-b845-74ddb464d265-kube-api-access-pb2qv" (OuterVolumeSpecName: "kube-api-access-pb2qv") pod "98de3136-445f-4f69-b845-74ddb464d265" (UID: "98de3136-445f-4f69-b845-74ddb464d265"). InnerVolumeSpecName "kube-api-access-pb2qv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:56:33 crc kubenswrapper[4909]: I1201 10:56:33.579971 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98de3136-445f-4f69-b845-74ddb464d265-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98de3136-445f-4f69-b845-74ddb464d265" (UID: "98de3136-445f-4f69-b845-74ddb464d265"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:56:33 crc kubenswrapper[4909]: I1201 10:56:33.622525 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98de3136-445f-4f69-b845-74ddb464d265-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:33 crc kubenswrapper[4909]: I1201 10:56:33.622605 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb2qv\" (UniqueName: \"kubernetes.io/projected/98de3136-445f-4f69-b845-74ddb464d265-kube-api-access-pb2qv\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:33 crc kubenswrapper[4909]: I1201 10:56:33.622623 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98de3136-445f-4f69-b845-74ddb464d265-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:56:33 crc kubenswrapper[4909]: I1201 10:56:33.973154 4909 generic.go:334] "Generic (PLEG): container finished" podID="98de3136-445f-4f69-b845-74ddb464d265" 
containerID="9897059ded36463ebf1207f3386dddaed4afa24fba459bdc3cad4bcee22a5c7a" exitCode=0 Dec 01 10:56:33 crc kubenswrapper[4909]: I1201 10:56:33.973201 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q4w6w" event={"ID":"98de3136-445f-4f69-b845-74ddb464d265","Type":"ContainerDied","Data":"9897059ded36463ebf1207f3386dddaed4afa24fba459bdc3cad4bcee22a5c7a"} Dec 01 10:56:33 crc kubenswrapper[4909]: I1201 10:56:33.973254 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q4w6w" event={"ID":"98de3136-445f-4f69-b845-74ddb464d265","Type":"ContainerDied","Data":"5459f3b57ac922e30fbb3e08fd237cd5f57c73b28ccfaa665f65a38c48fcacab"} Dec 01 10:56:33 crc kubenswrapper[4909]: I1201 10:56:33.973275 4909 scope.go:117] "RemoveContainer" containerID="9897059ded36463ebf1207f3386dddaed4afa24fba459bdc3cad4bcee22a5c7a" Dec 01 10:56:33 crc kubenswrapper[4909]: I1201 10:56:33.973518 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q4w6w" Dec 01 10:56:34 crc kubenswrapper[4909]: I1201 10:56:34.009334 4909 scope.go:117] "RemoveContainer" containerID="10c1beb35cc89e8b500371c10a7d6b4093e1d7709216fec8b26509c8f7ee2b47" Dec 01 10:56:34 crc kubenswrapper[4909]: I1201 10:56:34.009975 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q4w6w"] Dec 01 10:56:34 crc kubenswrapper[4909]: I1201 10:56:34.020703 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q4w6w"] Dec 01 10:56:34 crc kubenswrapper[4909]: I1201 10:56:34.035412 4909 scope.go:117] "RemoveContainer" containerID="8bad278cc3aec7c518360399028cb3dd0178a08365bf02bf9e31ca4f7ffc0f6b" Dec 01 10:56:34 crc kubenswrapper[4909]: I1201 10:56:34.074989 4909 scope.go:117] "RemoveContainer" containerID="9897059ded36463ebf1207f3386dddaed4afa24fba459bdc3cad4bcee22a5c7a" Dec 01 10:56:34 crc kubenswrapper[4909]: E1201 10:56:34.075273 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9897059ded36463ebf1207f3386dddaed4afa24fba459bdc3cad4bcee22a5c7a\": container with ID starting with 9897059ded36463ebf1207f3386dddaed4afa24fba459bdc3cad4bcee22a5c7a not found: ID does not exist" containerID="9897059ded36463ebf1207f3386dddaed4afa24fba459bdc3cad4bcee22a5c7a" Dec 01 10:56:34 crc kubenswrapper[4909]: I1201 10:56:34.075306 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9897059ded36463ebf1207f3386dddaed4afa24fba459bdc3cad4bcee22a5c7a"} err="failed to get container status \"9897059ded36463ebf1207f3386dddaed4afa24fba459bdc3cad4bcee22a5c7a\": rpc error: code = NotFound desc = could not find container \"9897059ded36463ebf1207f3386dddaed4afa24fba459bdc3cad4bcee22a5c7a\": container with ID starting with 9897059ded36463ebf1207f3386dddaed4afa24fba459bdc3cad4bcee22a5c7a not 
found: ID does not exist" Dec 01 10:56:34 crc kubenswrapper[4909]: I1201 10:56:34.075366 4909 scope.go:117] "RemoveContainer" containerID="10c1beb35cc89e8b500371c10a7d6b4093e1d7709216fec8b26509c8f7ee2b47" Dec 01 10:56:34 crc kubenswrapper[4909]: E1201 10:56:34.075646 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10c1beb35cc89e8b500371c10a7d6b4093e1d7709216fec8b26509c8f7ee2b47\": container with ID starting with 10c1beb35cc89e8b500371c10a7d6b4093e1d7709216fec8b26509c8f7ee2b47 not found: ID does not exist" containerID="10c1beb35cc89e8b500371c10a7d6b4093e1d7709216fec8b26509c8f7ee2b47" Dec 01 10:56:34 crc kubenswrapper[4909]: I1201 10:56:34.075683 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10c1beb35cc89e8b500371c10a7d6b4093e1d7709216fec8b26509c8f7ee2b47"} err="failed to get container status \"10c1beb35cc89e8b500371c10a7d6b4093e1d7709216fec8b26509c8f7ee2b47\": rpc error: code = NotFound desc = could not find container \"10c1beb35cc89e8b500371c10a7d6b4093e1d7709216fec8b26509c8f7ee2b47\": container with ID starting with 10c1beb35cc89e8b500371c10a7d6b4093e1d7709216fec8b26509c8f7ee2b47 not found: ID does not exist" Dec 01 10:56:34 crc kubenswrapper[4909]: I1201 10:56:34.075696 4909 scope.go:117] "RemoveContainer" containerID="8bad278cc3aec7c518360399028cb3dd0178a08365bf02bf9e31ca4f7ffc0f6b" Dec 01 10:56:34 crc kubenswrapper[4909]: E1201 10:56:34.075962 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bad278cc3aec7c518360399028cb3dd0178a08365bf02bf9e31ca4f7ffc0f6b\": container with ID starting with 8bad278cc3aec7c518360399028cb3dd0178a08365bf02bf9e31ca4f7ffc0f6b not found: ID does not exist" containerID="8bad278cc3aec7c518360399028cb3dd0178a08365bf02bf9e31ca4f7ffc0f6b" Dec 01 10:56:34 crc kubenswrapper[4909]: I1201 10:56:34.076027 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bad278cc3aec7c518360399028cb3dd0178a08365bf02bf9e31ca4f7ffc0f6b"} err="failed to get container status \"8bad278cc3aec7c518360399028cb3dd0178a08365bf02bf9e31ca4f7ffc0f6b\": rpc error: code = NotFound desc = could not find container \"8bad278cc3aec7c518360399028cb3dd0178a08365bf02bf9e31ca4f7ffc0f6b\": container with ID starting with 8bad278cc3aec7c518360399028cb3dd0178a08365bf02bf9e31ca4f7ffc0f6b not found: ID does not exist" Dec 01 10:56:35 crc kubenswrapper[4909]: I1201 10:56:35.272469 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98de3136-445f-4f69-b845-74ddb464d265" path="/var/lib/kubelet/pods/98de3136-445f-4f69-b845-74ddb464d265/volumes" Dec 01 10:56:36 crc kubenswrapper[4909]: I1201 10:56:36.193283 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:56:36 crc kubenswrapper[4909]: I1201 10:56:36.193339 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:56:36 crc kubenswrapper[4909]: I1201 10:56:36.193385 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 10:56:36 crc kubenswrapper[4909]: I1201 10:56:36.194140 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb"} pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:56:36 crc kubenswrapper[4909]: I1201 10:56:36.194216 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" containerID="cri-o://984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" gracePeriod=600 Dec 01 10:56:36 crc kubenswrapper[4909]: E1201 10:56:36.316374 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 10:56:37 crc kubenswrapper[4909]: I1201 10:56:37.006262 4909 generic.go:334] "Generic (PLEG): container finished" podID="672850e4-d044-44cc-b8a2-517dc1a285be" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" exitCode=0 Dec 01 10:56:37 crc kubenswrapper[4909]: I1201 10:56:37.006319 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerDied","Data":"984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb"} Dec 01 10:56:37 crc kubenswrapper[4909]: I1201 10:56:37.006391 4909 scope.go:117] "RemoveContainer" containerID="6aa807195832fa8b3d5986bee6241afe4be3be05e68f4c945ec2b1d547d17a95" Dec 01 10:56:37 crc kubenswrapper[4909]: I1201 10:56:37.007160 4909 
scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" Dec 01 10:56:37 crc kubenswrapper[4909]: E1201 10:56:37.007464 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 10:56:39 crc kubenswrapper[4909]: I1201 10:56:39.386264 4909 scope.go:117] "RemoveContainer" containerID="ee0e246e2aa5f7681950d03ef0bc3765d8607a071ec4f2f64e996a6092815529" Dec 01 10:56:39 crc kubenswrapper[4909]: I1201 10:56:39.408986 4909 scope.go:117] "RemoveContainer" containerID="c54ec3b4fa9c6e47dce4e5a815f494905aa2c56fbab9adc52920a2d79dc18d1a" Dec 01 10:56:39 crc kubenswrapper[4909]: I1201 10:56:39.427520 4909 scope.go:117] "RemoveContainer" containerID="e8799d07babc2b86da86ba8d7220b7c972036b9b65f45b2387bdb92657b227ea" Dec 01 10:56:39 crc kubenswrapper[4909]: I1201 10:56:39.446319 4909 scope.go:117] "RemoveContainer" containerID="bcbf7ef8a1a283205a204880e9bb988a4c899b693044b726582178e91e44bc26" Dec 01 10:56:39 crc kubenswrapper[4909]: I1201 10:56:39.463276 4909 scope.go:117] "RemoveContainer" containerID="70c3db63cada5186ae6ffca11add9dee9cb7ae0be9e164199e3dc20b2d990e50" Dec 01 10:56:39 crc kubenswrapper[4909]: I1201 10:56:39.485233 4909 scope.go:117] "RemoveContainer" containerID="2f30d210612764c2f454c0c7a932a4406a8b86e4f16433529990a56ac7d87309" Dec 01 10:56:51 crc kubenswrapper[4909]: I1201 10:56:51.258950 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" Dec 01 10:56:51 crc kubenswrapper[4909]: E1201 10:56:51.260067 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 10:57:02 crc kubenswrapper[4909]: I1201 10:57:02.258118 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" Dec 01 10:57:02 crc kubenswrapper[4909]: E1201 10:57:02.259223 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 10:57:14 crc kubenswrapper[4909]: I1201 10:57:14.256936 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" Dec 01 10:57:14 crc kubenswrapper[4909]: E1201 10:57:14.257617 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 10:57:22 crc kubenswrapper[4909]: I1201 10:57:22.909213 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8p4vb"] Dec 01 10:57:22 crc kubenswrapper[4909]: E1201 10:57:22.914160 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="98de3136-445f-4f69-b845-74ddb464d265" containerName="registry-server" Dec 01 10:57:22 crc kubenswrapper[4909]: I1201 10:57:22.914197 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="98de3136-445f-4f69-b845-74ddb464d265" containerName="registry-server" Dec 01 10:57:22 crc kubenswrapper[4909]: E1201 10:57:22.914220 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98de3136-445f-4f69-b845-74ddb464d265" containerName="extract-utilities" Dec 01 10:57:22 crc kubenswrapper[4909]: I1201 10:57:22.914234 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="98de3136-445f-4f69-b845-74ddb464d265" containerName="extract-utilities" Dec 01 10:57:22 crc kubenswrapper[4909]: E1201 10:57:22.914248 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98de3136-445f-4f69-b845-74ddb464d265" containerName="extract-content" Dec 01 10:57:22 crc kubenswrapper[4909]: I1201 10:57:22.914255 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="98de3136-445f-4f69-b845-74ddb464d265" containerName="extract-content" Dec 01 10:57:22 crc kubenswrapper[4909]: I1201 10:57:22.914430 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="98de3136-445f-4f69-b845-74ddb464d265" containerName="registry-server" Dec 01 10:57:22 crc kubenswrapper[4909]: I1201 10:57:22.915985 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8p4vb" Dec 01 10:57:22 crc kubenswrapper[4909]: I1201 10:57:22.918555 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8p4vb"] Dec 01 10:57:23 crc kubenswrapper[4909]: I1201 10:57:23.027987 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e81d71-dfc9-44cf-b226-133d99220628-catalog-content\") pod \"redhat-operators-8p4vb\" (UID: \"88e81d71-dfc9-44cf-b226-133d99220628\") " pod="openshift-marketplace/redhat-operators-8p4vb" Dec 01 10:57:23 crc kubenswrapper[4909]: I1201 10:57:23.028062 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e81d71-dfc9-44cf-b226-133d99220628-utilities\") pod \"redhat-operators-8p4vb\" (UID: \"88e81d71-dfc9-44cf-b226-133d99220628\") " pod="openshift-marketplace/redhat-operators-8p4vb" Dec 01 10:57:23 crc kubenswrapper[4909]: I1201 10:57:23.028118 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6scsj\" (UniqueName: \"kubernetes.io/projected/88e81d71-dfc9-44cf-b226-133d99220628-kube-api-access-6scsj\") pod \"redhat-operators-8p4vb\" (UID: \"88e81d71-dfc9-44cf-b226-133d99220628\") " pod="openshift-marketplace/redhat-operators-8p4vb" Dec 01 10:57:23 crc kubenswrapper[4909]: I1201 10:57:23.130075 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e81d71-dfc9-44cf-b226-133d99220628-utilities\") pod \"redhat-operators-8p4vb\" (UID: \"88e81d71-dfc9-44cf-b226-133d99220628\") " pod="openshift-marketplace/redhat-operators-8p4vb" Dec 01 10:57:23 crc kubenswrapper[4909]: I1201 10:57:23.130123 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6scsj\" (UniqueName: \"kubernetes.io/projected/88e81d71-dfc9-44cf-b226-133d99220628-kube-api-access-6scsj\") pod \"redhat-operators-8p4vb\" (UID: \"88e81d71-dfc9-44cf-b226-133d99220628\") " pod="openshift-marketplace/redhat-operators-8p4vb" Dec 01 10:57:23 crc kubenswrapper[4909]: I1201 10:57:23.130271 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e81d71-dfc9-44cf-b226-133d99220628-catalog-content\") pod \"redhat-operators-8p4vb\" (UID: \"88e81d71-dfc9-44cf-b226-133d99220628\") " pod="openshift-marketplace/redhat-operators-8p4vb" Dec 01 10:57:23 crc kubenswrapper[4909]: I1201 10:57:23.130750 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e81d71-dfc9-44cf-b226-133d99220628-utilities\") pod \"redhat-operators-8p4vb\" (UID: \"88e81d71-dfc9-44cf-b226-133d99220628\") " pod="openshift-marketplace/redhat-operators-8p4vb" Dec 01 10:57:23 crc kubenswrapper[4909]: I1201 10:57:23.130808 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e81d71-dfc9-44cf-b226-133d99220628-catalog-content\") pod \"redhat-operators-8p4vb\" (UID: \"88e81d71-dfc9-44cf-b226-133d99220628\") " pod="openshift-marketplace/redhat-operators-8p4vb" Dec 01 10:57:23 crc kubenswrapper[4909]: I1201 10:57:23.154720 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6scsj\" (UniqueName: \"kubernetes.io/projected/88e81d71-dfc9-44cf-b226-133d99220628-kube-api-access-6scsj\") pod \"redhat-operators-8p4vb\" (UID: \"88e81d71-dfc9-44cf-b226-133d99220628\") " pod="openshift-marketplace/redhat-operators-8p4vb" Dec 01 10:57:23 crc kubenswrapper[4909]: I1201 10:57:23.241104 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8p4vb" Dec 01 10:57:23 crc kubenswrapper[4909]: I1201 10:57:23.746804 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8p4vb"] Dec 01 10:57:24 crc kubenswrapper[4909]: I1201 10:57:24.510543 4909 generic.go:334] "Generic (PLEG): container finished" podID="88e81d71-dfc9-44cf-b226-133d99220628" containerID="2d382ad4bcb50ae3c7fb22813a19cc1b0d0641e6833f354126883e6eb2e42675" exitCode=0 Dec 01 10:57:24 crc kubenswrapper[4909]: I1201 10:57:24.510678 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p4vb" event={"ID":"88e81d71-dfc9-44cf-b226-133d99220628","Type":"ContainerDied","Data":"2d382ad4bcb50ae3c7fb22813a19cc1b0d0641e6833f354126883e6eb2e42675"} Dec 01 10:57:24 crc kubenswrapper[4909]: I1201 10:57:24.510919 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p4vb" event={"ID":"88e81d71-dfc9-44cf-b226-133d99220628","Type":"ContainerStarted","Data":"3e8da4a750ceb4d41e1c8483c336ae007081609469a814a8e53ca73466de5349"} Dec 01 10:57:25 crc kubenswrapper[4909]: I1201 10:57:25.521012 4909 generic.go:334] "Generic (PLEG): container finished" podID="eac9f2ce-c178-44e9-918e-cf0eacbcc7b1" containerID="d16a6b80c9b1c456d78308d24b546609cb93201bbbf726c3b7d1ca05e6f763cc" exitCode=0 Dec 01 10:57:25 crc kubenswrapper[4909]: I1201 10:57:25.521191 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6" event={"ID":"eac9f2ce-c178-44e9-918e-cf0eacbcc7b1","Type":"ContainerDied","Data":"d16a6b80c9b1c456d78308d24b546609cb93201bbbf726c3b7d1ca05e6f763cc"} Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.008559 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.131034 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eac9f2ce-c178-44e9-918e-cf0eacbcc7b1-inventory\") pod \"eac9f2ce-c178-44e9-918e-cf0eacbcc7b1\" (UID: \"eac9f2ce-c178-44e9-918e-cf0eacbcc7b1\") " Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.131458 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eac9f2ce-c178-44e9-918e-cf0eacbcc7b1-ssh-key\") pod \"eac9f2ce-c178-44e9-918e-cf0eacbcc7b1\" (UID: \"eac9f2ce-c178-44e9-918e-cf0eacbcc7b1\") " Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.131590 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m59d6\" (UniqueName: \"kubernetes.io/projected/eac9f2ce-c178-44e9-918e-cf0eacbcc7b1-kube-api-access-m59d6\") pod \"eac9f2ce-c178-44e9-918e-cf0eacbcc7b1\" (UID: \"eac9f2ce-c178-44e9-918e-cf0eacbcc7b1\") " Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.146285 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eac9f2ce-c178-44e9-918e-cf0eacbcc7b1-kube-api-access-m59d6" (OuterVolumeSpecName: "kube-api-access-m59d6") pod "eac9f2ce-c178-44e9-918e-cf0eacbcc7b1" (UID: "eac9f2ce-c178-44e9-918e-cf0eacbcc7b1"). InnerVolumeSpecName "kube-api-access-m59d6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.165796 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eac9f2ce-c178-44e9-918e-cf0eacbcc7b1-inventory" (OuterVolumeSpecName: "inventory") pod "eac9f2ce-c178-44e9-918e-cf0eacbcc7b1" (UID: "eac9f2ce-c178-44e9-918e-cf0eacbcc7b1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.166842 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eac9f2ce-c178-44e9-918e-cf0eacbcc7b1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "eac9f2ce-c178-44e9-918e-cf0eacbcc7b1" (UID: "eac9f2ce-c178-44e9-918e-cf0eacbcc7b1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.234029 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eac9f2ce-c178-44e9-918e-cf0eacbcc7b1-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.234333 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m59d6\" (UniqueName: \"kubernetes.io/projected/eac9f2ce-c178-44e9-918e-cf0eacbcc7b1-kube-api-access-m59d6\") on node \"crc\" DevicePath \"\"" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.234350 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eac9f2ce-c178-44e9-918e-cf0eacbcc7b1-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.597417 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6" event={"ID":"eac9f2ce-c178-44e9-918e-cf0eacbcc7b1","Type":"ContainerDied","Data":"134708b58c62e0969c96271fb8a0c6298c05139aeb98fb511267c0258006a749"} Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.597478 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="134708b58c62e0969c96271fb8a0c6298c05139aeb98fb511267c0258006a749" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.597572 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.642388 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w"] Dec 01 10:57:27 crc kubenswrapper[4909]: E1201 10:57:27.642955 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac9f2ce-c178-44e9-918e-cf0eacbcc7b1" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.642976 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac9f2ce-c178-44e9-918e-cf0eacbcc7b1" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.643261 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="eac9f2ce-c178-44e9-918e-cf0eacbcc7b1" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.644109 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.648488 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.650110 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.650510 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.650759 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.658062 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w"] Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.751448 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60d8cc9d-aaa3-4244-9cd2-c384293f0328-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w\" (UID: \"60d8cc9d-aaa3-4244-9cd2-c384293f0328\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.751549 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/60d8cc9d-aaa3-4244-9cd2-c384293f0328-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w\" (UID: \"60d8cc9d-aaa3-4244-9cd2-c384293f0328\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.751625 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnnnl\" (UniqueName: \"kubernetes.io/projected/60d8cc9d-aaa3-4244-9cd2-c384293f0328-kube-api-access-cnnnl\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w\" (UID: \"60d8cc9d-aaa3-4244-9cd2-c384293f0328\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.853385 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnnnl\" (UniqueName: \"kubernetes.io/projected/60d8cc9d-aaa3-4244-9cd2-c384293f0328-kube-api-access-cnnnl\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w\" (UID: \"60d8cc9d-aaa3-4244-9cd2-c384293f0328\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.853560 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60d8cc9d-aaa3-4244-9cd2-c384293f0328-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w\" (UID: \"60d8cc9d-aaa3-4244-9cd2-c384293f0328\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.853646 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/60d8cc9d-aaa3-4244-9cd2-c384293f0328-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w\" (UID: \"60d8cc9d-aaa3-4244-9cd2-c384293f0328\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.860423 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/60d8cc9d-aaa3-4244-9cd2-c384293f0328-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w\" (UID: \"60d8cc9d-aaa3-4244-9cd2-c384293f0328\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.860518 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60d8cc9d-aaa3-4244-9cd2-c384293f0328-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w\" (UID: \"60d8cc9d-aaa3-4244-9cd2-c384293f0328\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.876864 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnnnl\" (UniqueName: \"kubernetes.io/projected/60d8cc9d-aaa3-4244-9cd2-c384293f0328-kube-api-access-cnnnl\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w\" (UID: \"60d8cc9d-aaa3-4244-9cd2-c384293f0328\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w" Dec 01 10:57:27 crc kubenswrapper[4909]: I1201 10:57:27.970097 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w" Dec 01 10:57:28 crc kubenswrapper[4909]: I1201 10:57:28.579336 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w"] Dec 01 10:57:28 crc kubenswrapper[4909]: W1201 10:57:28.586119 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60d8cc9d_aaa3_4244_9cd2_c384293f0328.slice/crio-d06a25907e4d9f65534bb8f2cab7a33f345723723fb1f6bd727fdd63a33a307a WatchSource:0}: Error finding container d06a25907e4d9f65534bb8f2cab7a33f345723723fb1f6bd727fdd63a33a307a: Status 404 returned error can't find the container with id d06a25907e4d9f65534bb8f2cab7a33f345723723fb1f6bd727fdd63a33a307a Dec 01 10:57:28 crc kubenswrapper[4909]: I1201 10:57:28.609728 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w" event={"ID":"60d8cc9d-aaa3-4244-9cd2-c384293f0328","Type":"ContainerStarted","Data":"d06a25907e4d9f65534bb8f2cab7a33f345723723fb1f6bd727fdd63a33a307a"} Dec 01 10:57:29 crc kubenswrapper[4909]: I1201 10:57:29.257535 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" Dec 01 10:57:29 crc kubenswrapper[4909]: E1201 10:57:29.258329 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 10:57:35 crc kubenswrapper[4909]: I1201 10:57:35.690695 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w" event={"ID":"60d8cc9d-aaa3-4244-9cd2-c384293f0328","Type":"ContainerStarted","Data":"43fb82b9f45c471b97e34fa64c47077ef31b6ecfb74d5a31e2c7fe0562b1116b"} Dec 01 10:57:35 crc kubenswrapper[4909]: I1201 10:57:35.693563 4909 generic.go:334] "Generic (PLEG): container finished" podID="88e81d71-dfc9-44cf-b226-133d99220628" containerID="44f20c9c08e8b462d540460e1edd124e8dabad86ec63a50571bc57bba33f1d59" exitCode=0 Dec 01 10:57:35 crc kubenswrapper[4909]: I1201 10:57:35.693622 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p4vb" event={"ID":"88e81d71-dfc9-44cf-b226-133d99220628","Type":"ContainerDied","Data":"44f20c9c08e8b462d540460e1edd124e8dabad86ec63a50571bc57bba33f1d59"} Dec 01 10:57:35 crc kubenswrapper[4909]: I1201 10:57:35.747993 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w" podStartSLOduration=2.769262231 podStartE2EDuration="8.747971046s" podCreationTimestamp="2025-12-01 10:57:27 +0000 UTC" firstStartedPulling="2025-12-01 10:57:28.588725326 +0000 UTC m=+1565.823196224" lastFinishedPulling="2025-12-01 10:57:34.567434141 +0000 UTC m=+1571.801905039" observedRunningTime="2025-12-01 10:57:35.714285217 +0000 UTC m=+1572.948756115" watchObservedRunningTime="2025-12-01 10:57:35.747971046 +0000 UTC m=+1572.982441944" Dec 01 10:57:36 crc kubenswrapper[4909]: I1201 10:57:36.040670 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-hlldf"] Dec 01 10:57:36 crc kubenswrapper[4909]: I1201 10:57:36.049158 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-2905-account-create-update-v7wdd"] Dec 01 10:57:36 crc kubenswrapper[4909]: I1201 10:57:36.059032 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-hlldf"] Dec 01 10:57:36 crc kubenswrapper[4909]: I1201 
10:57:36.069375 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-2905-account-create-update-v7wdd"] Dec 01 10:57:36 crc kubenswrapper[4909]: I1201 10:57:36.714016 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p4vb" event={"ID":"88e81d71-dfc9-44cf-b226-133d99220628","Type":"ContainerStarted","Data":"6fb1bc8ffca3c4ecd30e552f0b0980a4d11739a9097046989e49a50d7e9304ba"} Dec 01 10:57:36 crc kubenswrapper[4909]: I1201 10:57:36.736649 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8p4vb" podStartSLOduration=2.877255751 podStartE2EDuration="14.736633475s" podCreationTimestamp="2025-12-01 10:57:22 +0000 UTC" firstStartedPulling="2025-12-01 10:57:24.513022959 +0000 UTC m=+1561.747493857" lastFinishedPulling="2025-12-01 10:57:36.372400683 +0000 UTC m=+1573.606871581" observedRunningTime="2025-12-01 10:57:36.732022262 +0000 UTC m=+1573.966493170" watchObservedRunningTime="2025-12-01 10:57:36.736633475 +0000 UTC m=+1573.971104373" Dec 01 10:57:37 crc kubenswrapper[4909]: I1201 10:57:37.270166 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="384e1871-1564-43db-87f7-522394755854" path="/var/lib/kubelet/pods/384e1871-1564-43db-87f7-522394755854/volumes" Dec 01 10:57:37 crc kubenswrapper[4909]: I1201 10:57:37.270816 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="411e6d91-2c4d-44a0-94ed-09347706bc05" path="/var/lib/kubelet/pods/411e6d91-2c4d-44a0-94ed-09347706bc05/volumes" Dec 01 10:57:39 crc kubenswrapper[4909]: I1201 10:57:39.567845 4909 scope.go:117] "RemoveContainer" containerID="c2cfc4149410bf7412755a0a3ec28e44bac6a0bd97ef8ec5c1164b1ed29f18fa" Dec 01 10:57:39 crc kubenswrapper[4909]: I1201 10:57:39.602202 4909 scope.go:117] "RemoveContainer" containerID="bc7a3ac1ad562ff021dd6fb836dfaf3d6e6849078d152d1703cafc764afe29e5" Dec 01 10:57:40 crc kubenswrapper[4909]: I1201 
10:57:40.034278 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-p56hk"] Dec 01 10:57:40 crc kubenswrapper[4909]: I1201 10:57:40.045089 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-p56hk"] Dec 01 10:57:40 crc kubenswrapper[4909]: I1201 10:57:40.053859 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-def2-account-create-update-4n7lx"] Dec 01 10:57:40 crc kubenswrapper[4909]: I1201 10:57:40.070583 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-jt8fv"] Dec 01 10:57:40 crc kubenswrapper[4909]: I1201 10:57:40.078700 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4143-account-create-update-9v4tm"] Dec 01 10:57:40 crc kubenswrapper[4909]: I1201 10:57:40.087478 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-def2-account-create-update-4n7lx"] Dec 01 10:57:40 crc kubenswrapper[4909]: I1201 10:57:40.097512 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-jt8fv"] Dec 01 10:57:40 crc kubenswrapper[4909]: I1201 10:57:40.107712 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4143-account-create-update-9v4tm"] Dec 01 10:57:40 crc kubenswrapper[4909]: I1201 10:57:40.767027 4909 generic.go:334] "Generic (PLEG): container finished" podID="60d8cc9d-aaa3-4244-9cd2-c384293f0328" containerID="43fb82b9f45c471b97e34fa64c47077ef31b6ecfb74d5a31e2c7fe0562b1116b" exitCode=0 Dec 01 10:57:40 crc kubenswrapper[4909]: I1201 10:57:40.767126 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w" event={"ID":"60d8cc9d-aaa3-4244-9cd2-c384293f0328","Type":"ContainerDied","Data":"43fb82b9f45c471b97e34fa64c47077ef31b6ecfb74d5a31e2c7fe0562b1116b"} Dec 01 10:57:41 crc kubenswrapper[4909]: I1201 10:57:41.271320 4909 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="33db1774-4b90-4cba-be63-744f8b79f29c" path="/var/lib/kubelet/pods/33db1774-4b90-4cba-be63-744f8b79f29c/volumes" Dec 01 10:57:41 crc kubenswrapper[4909]: I1201 10:57:41.272001 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f21bdf9-c58c-4b57-b3fe-f64338c83e32" path="/var/lib/kubelet/pods/8f21bdf9-c58c-4b57-b3fe-f64338c83e32/volumes" Dec 01 10:57:41 crc kubenswrapper[4909]: I1201 10:57:41.272575 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f9510b7-d3ab-4508-bf7e-968d2d3b684c" path="/var/lib/kubelet/pods/9f9510b7-d3ab-4508-bf7e-968d2d3b684c/volumes" Dec 01 10:57:41 crc kubenswrapper[4909]: I1201 10:57:41.273615 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af2c338d-8319-457b-ad3b-cbf51df55d8a" path="/var/lib/kubelet/pods/af2c338d-8319-457b-ad3b-cbf51df55d8a/volumes" Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.266957 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w" Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.391042 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/60d8cc9d-aaa3-4244-9cd2-c384293f0328-ssh-key\") pod \"60d8cc9d-aaa3-4244-9cd2-c384293f0328\" (UID: \"60d8cc9d-aaa3-4244-9cd2-c384293f0328\") " Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.391100 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60d8cc9d-aaa3-4244-9cd2-c384293f0328-inventory\") pod \"60d8cc9d-aaa3-4244-9cd2-c384293f0328\" (UID: \"60d8cc9d-aaa3-4244-9cd2-c384293f0328\") " Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.391168 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnnnl\" (UniqueName: \"kubernetes.io/projected/60d8cc9d-aaa3-4244-9cd2-c384293f0328-kube-api-access-cnnnl\") pod \"60d8cc9d-aaa3-4244-9cd2-c384293f0328\" (UID: \"60d8cc9d-aaa3-4244-9cd2-c384293f0328\") " Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.397472 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60d8cc9d-aaa3-4244-9cd2-c384293f0328-kube-api-access-cnnnl" (OuterVolumeSpecName: "kube-api-access-cnnnl") pod "60d8cc9d-aaa3-4244-9cd2-c384293f0328" (UID: "60d8cc9d-aaa3-4244-9cd2-c384293f0328"). InnerVolumeSpecName "kube-api-access-cnnnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.423240 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60d8cc9d-aaa3-4244-9cd2-c384293f0328-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "60d8cc9d-aaa3-4244-9cd2-c384293f0328" (UID: "60d8cc9d-aaa3-4244-9cd2-c384293f0328"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.426157 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60d8cc9d-aaa3-4244-9cd2-c384293f0328-inventory" (OuterVolumeSpecName: "inventory") pod "60d8cc9d-aaa3-4244-9cd2-c384293f0328" (UID: "60d8cc9d-aaa3-4244-9cd2-c384293f0328"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.493931 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/60d8cc9d-aaa3-4244-9cd2-c384293f0328-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.493974 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60d8cc9d-aaa3-4244-9cd2-c384293f0328-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.493987 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnnnl\" (UniqueName: \"kubernetes.io/projected/60d8cc9d-aaa3-4244-9cd2-c384293f0328-kube-api-access-cnnnl\") on node \"crc\" DevicePath \"\"" Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.807193 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w" event={"ID":"60d8cc9d-aaa3-4244-9cd2-c384293f0328","Type":"ContainerDied","Data":"d06a25907e4d9f65534bb8f2cab7a33f345723723fb1f6bd727fdd63a33a307a"} Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.807255 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d06a25907e4d9f65534bb8f2cab7a33f345723723fb1f6bd727fdd63a33a307a" Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.807374 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w" Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.872244 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d"] Dec 01 10:57:42 crc kubenswrapper[4909]: E1201 10:57:42.873090 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d8cc9d-aaa3-4244-9cd2-c384293f0328" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.873180 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d8cc9d-aaa3-4244-9cd2-c384293f0328" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.873492 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="60d8cc9d-aaa3-4244-9cd2-c384293f0328" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.874463 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d" Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.877837 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.878465 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.878702 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.878956 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 10:57:42 crc kubenswrapper[4909]: I1201 10:57:42.884142 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d"] Dec 01 10:57:43 crc kubenswrapper[4909]: I1201 10:57:43.004274 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz6dx\" (UniqueName: \"kubernetes.io/projected/c539a3fa-4b2a-4a11-91ab-9996e6c0c99d-kube-api-access-wz6dx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dqq4d\" (UID: \"c539a3fa-4b2a-4a11-91ab-9996e6c0c99d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d" Dec 01 10:57:43 crc kubenswrapper[4909]: I1201 10:57:43.004376 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c539a3fa-4b2a-4a11-91ab-9996e6c0c99d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dqq4d\" (UID: \"c539a3fa-4b2a-4a11-91ab-9996e6c0c99d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d" Dec 01 10:57:43 crc kubenswrapper[4909]: I1201 10:57:43.004800 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c539a3fa-4b2a-4a11-91ab-9996e6c0c99d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dqq4d\" (UID: \"c539a3fa-4b2a-4a11-91ab-9996e6c0c99d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d" Dec 01 10:57:43 crc kubenswrapper[4909]: I1201 10:57:43.106603 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz6dx\" (UniqueName: \"kubernetes.io/projected/c539a3fa-4b2a-4a11-91ab-9996e6c0c99d-kube-api-access-wz6dx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dqq4d\" (UID: \"c539a3fa-4b2a-4a11-91ab-9996e6c0c99d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d" Dec 01 10:57:43 crc kubenswrapper[4909]: I1201 10:57:43.106716 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c539a3fa-4b2a-4a11-91ab-9996e6c0c99d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dqq4d\" (UID: \"c539a3fa-4b2a-4a11-91ab-9996e6c0c99d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d" Dec 01 10:57:43 crc kubenswrapper[4909]: I1201 10:57:43.106803 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c539a3fa-4b2a-4a11-91ab-9996e6c0c99d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dqq4d\" (UID: \"c539a3fa-4b2a-4a11-91ab-9996e6c0c99d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d" Dec 01 10:57:43 crc kubenswrapper[4909]: I1201 10:57:43.112532 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c539a3fa-4b2a-4a11-91ab-9996e6c0c99d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dqq4d\" (UID: 
\"c539a3fa-4b2a-4a11-91ab-9996e6c0c99d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d" Dec 01 10:57:43 crc kubenswrapper[4909]: I1201 10:57:43.113033 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c539a3fa-4b2a-4a11-91ab-9996e6c0c99d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dqq4d\" (UID: \"c539a3fa-4b2a-4a11-91ab-9996e6c0c99d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d" Dec 01 10:57:43 crc kubenswrapper[4909]: I1201 10:57:43.124137 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz6dx\" (UniqueName: \"kubernetes.io/projected/c539a3fa-4b2a-4a11-91ab-9996e6c0c99d-kube-api-access-wz6dx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dqq4d\" (UID: \"c539a3fa-4b2a-4a11-91ab-9996e6c0c99d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d" Dec 01 10:57:43 crc kubenswrapper[4909]: I1201 10:57:43.202083 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d" Dec 01 10:57:43 crc kubenswrapper[4909]: I1201 10:57:43.241376 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8p4vb" Dec 01 10:57:43 crc kubenswrapper[4909]: I1201 10:57:43.242040 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8p4vb" Dec 01 10:57:43 crc kubenswrapper[4909]: I1201 10:57:43.272038 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" Dec 01 10:57:43 crc kubenswrapper[4909]: E1201 10:57:43.273391 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 10:57:43 crc kubenswrapper[4909]: I1201 10:57:43.307747 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8p4vb" Dec 01 10:57:43 crc kubenswrapper[4909]: I1201 10:57:43.745962 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d"] Dec 01 10:57:43 crc kubenswrapper[4909]: W1201 10:57:43.748052 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc539a3fa_4b2a_4a11_91ab_9996e6c0c99d.slice/crio-15ea661299a90a744c09b6a0e9c8c1d014a5dc49b6ec0c82dacf3cfc29f99bdc WatchSource:0}: Error finding container 15ea661299a90a744c09b6a0e9c8c1d014a5dc49b6ec0c82dacf3cfc29f99bdc: Status 404 returned error can't find the container with id 
15ea661299a90a744c09b6a0e9c8c1d014a5dc49b6ec0c82dacf3cfc29f99bdc Dec 01 10:57:43 crc kubenswrapper[4909]: I1201 10:57:43.752094 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:57:43 crc kubenswrapper[4909]: I1201 10:57:43.817518 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d" event={"ID":"c539a3fa-4b2a-4a11-91ab-9996e6c0c99d","Type":"ContainerStarted","Data":"15ea661299a90a744c09b6a0e9c8c1d014a5dc49b6ec0c82dacf3cfc29f99bdc"} Dec 01 10:57:43 crc kubenswrapper[4909]: I1201 10:57:43.875726 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8p4vb" Dec 01 10:57:43 crc kubenswrapper[4909]: I1201 10:57:43.956128 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8p4vb"] Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.018224 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z5f7f"] Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.018473 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z5f7f" podUID="2345b837-4d7d-4d0a-b834-cb26782887ed" containerName="registry-server" containerID="cri-o://8aa72338339d9218c59b3b7781c85a8743a67ff7d514dc34528d9f8d90f36d33" gracePeriod=2 Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.622845 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z5f7f" Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.742047 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2xzg\" (UniqueName: \"kubernetes.io/projected/2345b837-4d7d-4d0a-b834-cb26782887ed-kube-api-access-f2xzg\") pod \"2345b837-4d7d-4d0a-b834-cb26782887ed\" (UID: \"2345b837-4d7d-4d0a-b834-cb26782887ed\") " Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.742483 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2345b837-4d7d-4d0a-b834-cb26782887ed-catalog-content\") pod \"2345b837-4d7d-4d0a-b834-cb26782887ed\" (UID: \"2345b837-4d7d-4d0a-b834-cb26782887ed\") " Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.742528 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2345b837-4d7d-4d0a-b834-cb26782887ed-utilities\") pod \"2345b837-4d7d-4d0a-b834-cb26782887ed\" (UID: \"2345b837-4d7d-4d0a-b834-cb26782887ed\") " Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.743399 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2345b837-4d7d-4d0a-b834-cb26782887ed-utilities" (OuterVolumeSpecName: "utilities") pod "2345b837-4d7d-4d0a-b834-cb26782887ed" (UID: "2345b837-4d7d-4d0a-b834-cb26782887ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.762031 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2345b837-4d7d-4d0a-b834-cb26782887ed-kube-api-access-f2xzg" (OuterVolumeSpecName: "kube-api-access-f2xzg") pod "2345b837-4d7d-4d0a-b834-cb26782887ed" (UID: "2345b837-4d7d-4d0a-b834-cb26782887ed"). InnerVolumeSpecName "kube-api-access-f2xzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.844098 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d" event={"ID":"c539a3fa-4b2a-4a11-91ab-9996e6c0c99d","Type":"ContainerStarted","Data":"a1725f9c2d3e35ac72dde2cc691b22aef77a2f3c6f49c86d579cf5f58ffec315"} Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.845433 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2xzg\" (UniqueName: \"kubernetes.io/projected/2345b837-4d7d-4d0a-b834-cb26782887ed-kube-api-access-f2xzg\") on node \"crc\" DevicePath \"\"" Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.845464 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2345b837-4d7d-4d0a-b834-cb26782887ed-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.849137 4909 generic.go:334] "Generic (PLEG): container finished" podID="2345b837-4d7d-4d0a-b834-cb26782887ed" containerID="8aa72338339d9218c59b3b7781c85a8743a67ff7d514dc34528d9f8d90f36d33" exitCode=0 Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.849345 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z5f7f" Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.849796 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z5f7f" event={"ID":"2345b837-4d7d-4d0a-b834-cb26782887ed","Type":"ContainerDied","Data":"8aa72338339d9218c59b3b7781c85a8743a67ff7d514dc34528d9f8d90f36d33"} Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.849847 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z5f7f" event={"ID":"2345b837-4d7d-4d0a-b834-cb26782887ed","Type":"ContainerDied","Data":"f49e6be80f4312219fbe0ce2f69b1da86410acd2d4cf1801c4687b1b3e4d63c7"} Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.849871 4909 scope.go:117] "RemoveContainer" containerID="8aa72338339d9218c59b3b7781c85a8743a67ff7d514dc34528d9f8d90f36d33" Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.855681 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2345b837-4d7d-4d0a-b834-cb26782887ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2345b837-4d7d-4d0a-b834-cb26782887ed" (UID: "2345b837-4d7d-4d0a-b834-cb26782887ed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.887849 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d" podStartSLOduration=2.28969783 podStartE2EDuration="2.887830825s" podCreationTimestamp="2025-12-01 10:57:42 +0000 UTC" firstStartedPulling="2025-12-01 10:57:43.751789011 +0000 UTC m=+1580.986259909" lastFinishedPulling="2025-12-01 10:57:44.349921996 +0000 UTC m=+1581.584392904" observedRunningTime="2025-12-01 10:57:44.868962103 +0000 UTC m=+1582.103433001" watchObservedRunningTime="2025-12-01 10:57:44.887830825 +0000 UTC m=+1582.122301723" Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.889087 4909 scope.go:117] "RemoveContainer" containerID="bab7a2e0ccdea4c4b6752456a916bc7c2d423066f3ba2ca935f0e72ab4ef6df2" Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.922776 4909 scope.go:117] "RemoveContainer" containerID="ad57e0e84a5c85d965eb8bafc30362509db4ddc97c5b3d6dba7a0a3b24221d6d" Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.947587 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2345b837-4d7d-4d0a-b834-cb26782887ed-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.956167 4909 scope.go:117] "RemoveContainer" containerID="8aa72338339d9218c59b3b7781c85a8743a67ff7d514dc34528d9f8d90f36d33" Dec 01 10:57:44 crc kubenswrapper[4909]: E1201 10:57:44.958347 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aa72338339d9218c59b3b7781c85a8743a67ff7d514dc34528d9f8d90f36d33\": container with ID starting with 8aa72338339d9218c59b3b7781c85a8743a67ff7d514dc34528d9f8d90f36d33 not found: ID does not exist" containerID="8aa72338339d9218c59b3b7781c85a8743a67ff7d514dc34528d9f8d90f36d33" Dec 01 10:57:44 crc 
kubenswrapper[4909]: I1201 10:57:44.958408 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aa72338339d9218c59b3b7781c85a8743a67ff7d514dc34528d9f8d90f36d33"} err="failed to get container status \"8aa72338339d9218c59b3b7781c85a8743a67ff7d514dc34528d9f8d90f36d33\": rpc error: code = NotFound desc = could not find container \"8aa72338339d9218c59b3b7781c85a8743a67ff7d514dc34528d9f8d90f36d33\": container with ID starting with 8aa72338339d9218c59b3b7781c85a8743a67ff7d514dc34528d9f8d90f36d33 not found: ID does not exist" Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.958440 4909 scope.go:117] "RemoveContainer" containerID="bab7a2e0ccdea4c4b6752456a916bc7c2d423066f3ba2ca935f0e72ab4ef6df2" Dec 01 10:57:44 crc kubenswrapper[4909]: E1201 10:57:44.959335 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bab7a2e0ccdea4c4b6752456a916bc7c2d423066f3ba2ca935f0e72ab4ef6df2\": container with ID starting with bab7a2e0ccdea4c4b6752456a916bc7c2d423066f3ba2ca935f0e72ab4ef6df2 not found: ID does not exist" containerID="bab7a2e0ccdea4c4b6752456a916bc7c2d423066f3ba2ca935f0e72ab4ef6df2" Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.959364 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bab7a2e0ccdea4c4b6752456a916bc7c2d423066f3ba2ca935f0e72ab4ef6df2"} err="failed to get container status \"bab7a2e0ccdea4c4b6752456a916bc7c2d423066f3ba2ca935f0e72ab4ef6df2\": rpc error: code = NotFound desc = could not find container \"bab7a2e0ccdea4c4b6752456a916bc7c2d423066f3ba2ca935f0e72ab4ef6df2\": container with ID starting with bab7a2e0ccdea4c4b6752456a916bc7c2d423066f3ba2ca935f0e72ab4ef6df2 not found: ID does not exist" Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.959387 4909 scope.go:117] "RemoveContainer" containerID="ad57e0e84a5c85d965eb8bafc30362509db4ddc97c5b3d6dba7a0a3b24221d6d" Dec 01 
10:57:44 crc kubenswrapper[4909]: E1201 10:57:44.959714 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad57e0e84a5c85d965eb8bafc30362509db4ddc97c5b3d6dba7a0a3b24221d6d\": container with ID starting with ad57e0e84a5c85d965eb8bafc30362509db4ddc97c5b3d6dba7a0a3b24221d6d not found: ID does not exist" containerID="ad57e0e84a5c85d965eb8bafc30362509db4ddc97c5b3d6dba7a0a3b24221d6d" Dec 01 10:57:44 crc kubenswrapper[4909]: I1201 10:57:44.959736 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad57e0e84a5c85d965eb8bafc30362509db4ddc97c5b3d6dba7a0a3b24221d6d"} err="failed to get container status \"ad57e0e84a5c85d965eb8bafc30362509db4ddc97c5b3d6dba7a0a3b24221d6d\": rpc error: code = NotFound desc = could not find container \"ad57e0e84a5c85d965eb8bafc30362509db4ddc97c5b3d6dba7a0a3b24221d6d\": container with ID starting with ad57e0e84a5c85d965eb8bafc30362509db4ddc97c5b3d6dba7a0a3b24221d6d not found: ID does not exist" Dec 01 10:57:45 crc kubenswrapper[4909]: I1201 10:57:45.184804 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z5f7f"] Dec 01 10:57:45 crc kubenswrapper[4909]: I1201 10:57:45.202495 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z5f7f"] Dec 01 10:57:45 crc kubenswrapper[4909]: I1201 10:57:45.267864 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2345b837-4d7d-4d0a-b834-cb26782887ed" path="/var/lib/kubelet/pods/2345b837-4d7d-4d0a-b834-cb26782887ed/volumes" Dec 01 10:57:56 crc kubenswrapper[4909]: I1201 10:57:56.258249 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" Dec 01 10:57:56 crc kubenswrapper[4909]: E1201 10:57:56.259134 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 10:58:02 crc kubenswrapper[4909]: I1201 10:58:02.042788 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-xx8p2"] Dec 01 10:58:02 crc kubenswrapper[4909]: I1201 10:58:02.055163 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-xx8p2"] Dec 01 10:58:03 crc kubenswrapper[4909]: I1201 10:58:03.267443 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e80149e-1959-4ae7-a8b3-41fc91f45121" path="/var/lib/kubelet/pods/2e80149e-1959-4ae7-a8b3-41fc91f45121/volumes" Dec 01 10:58:09 crc kubenswrapper[4909]: I1201 10:58:09.258103 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" Dec 01 10:58:09 crc kubenswrapper[4909]: E1201 10:58:09.258925 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 10:58:11 crc kubenswrapper[4909]: I1201 10:58:11.051374 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-z4t9q"] Dec 01 10:58:11 crc kubenswrapper[4909]: I1201 10:58:11.059226 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-8r6cb"] Dec 01 10:58:11 crc kubenswrapper[4909]: I1201 10:58:11.069234 4909 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/neutron-cbeb-account-create-update-8lttg"] Dec 01 10:58:11 crc kubenswrapper[4909]: I1201 10:58:11.076620 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-8r6cb"] Dec 01 10:58:11 crc kubenswrapper[4909]: I1201 10:58:11.084952 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-z4t9q"] Dec 01 10:58:11 crc kubenswrapper[4909]: I1201 10:58:11.092685 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-cbeb-account-create-update-8lttg"] Dec 01 10:58:11 crc kubenswrapper[4909]: I1201 10:58:11.100991 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-3f20-account-create-update-xfjjq"] Dec 01 10:58:11 crc kubenswrapper[4909]: I1201 10:58:11.112077 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-891a-account-create-update-dpwsn"] Dec 01 10:58:11 crc kubenswrapper[4909]: I1201 10:58:11.122781 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-tb8bv"] Dec 01 10:58:11 crc kubenswrapper[4909]: I1201 10:58:11.131277 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-891a-account-create-update-dpwsn"] Dec 01 10:58:11 crc kubenswrapper[4909]: I1201 10:58:11.140332 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-tb8bv"] Dec 01 10:58:11 crc kubenswrapper[4909]: I1201 10:58:11.148904 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-3f20-account-create-update-xfjjq"] Dec 01 10:58:11 crc kubenswrapper[4909]: I1201 10:58:11.270332 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ae5889f-72cd-4333-9095-93bf13bcdc14" path="/var/lib/kubelet/pods/5ae5889f-72cd-4333-9095-93bf13bcdc14/volumes" Dec 01 10:58:11 crc kubenswrapper[4909]: I1201 10:58:11.270991 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5d4cdf99-e33d-4314-ba4d-3cad05e58712" path="/var/lib/kubelet/pods/5d4cdf99-e33d-4314-ba4d-3cad05e58712/volumes" Dec 01 10:58:11 crc kubenswrapper[4909]: I1201 10:58:11.271529 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76721a6a-527e-4c51-8fbb-5a63dfb515a0" path="/var/lib/kubelet/pods/76721a6a-527e-4c51-8fbb-5a63dfb515a0/volumes" Dec 01 10:58:11 crc kubenswrapper[4909]: I1201 10:58:11.272084 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96ffe604-13b8-4b0e-ba75-ed64c19a9b8e" path="/var/lib/kubelet/pods/96ffe604-13b8-4b0e-ba75-ed64c19a9b8e/volumes" Dec 01 10:58:11 crc kubenswrapper[4909]: I1201 10:58:11.273130 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36" path="/var/lib/kubelet/pods/9a51be9c-9fb0-4532-8b3d-1f3d1adc7d36/volumes" Dec 01 10:58:11 crc kubenswrapper[4909]: I1201 10:58:11.273697 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e50ed289-5809-492b-a95a-895da6ba0c76" path="/var/lib/kubelet/pods/e50ed289-5809-492b-a95a-895da6ba0c76/volumes" Dec 01 10:58:15 crc kubenswrapper[4909]: I1201 10:58:15.041213 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-ffhtn"] Dec 01 10:58:15 crc kubenswrapper[4909]: I1201 10:58:15.048976 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-ffhtn"] Dec 01 10:58:15 crc kubenswrapper[4909]: I1201 10:58:15.267274 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0182ce0e-1710-4fdb-b7cd-6143b66d6c0b" path="/var/lib/kubelet/pods/0182ce0e-1710-4fdb-b7cd-6143b66d6c0b/volumes" Dec 01 10:58:23 crc kubenswrapper[4909]: I1201 10:58:23.182050 4909 generic.go:334] "Generic (PLEG): container finished" podID="c539a3fa-4b2a-4a11-91ab-9996e6c0c99d" containerID="a1725f9c2d3e35ac72dde2cc691b22aef77a2f3c6f49c86d579cf5f58ffec315" exitCode=0 Dec 01 10:58:23 crc kubenswrapper[4909]: I1201 
10:58:23.182115 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d" event={"ID":"c539a3fa-4b2a-4a11-91ab-9996e6c0c99d","Type":"ContainerDied","Data":"a1725f9c2d3e35ac72dde2cc691b22aef77a2f3c6f49c86d579cf5f58ffec315"} Dec 01 10:58:23 crc kubenswrapper[4909]: I1201 10:58:23.264459 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" Dec 01 10:58:23 crc kubenswrapper[4909]: E1201 10:58:23.264733 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 10:58:24 crc kubenswrapper[4909]: I1201 10:58:24.706376 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d" Dec 01 10:58:24 crc kubenswrapper[4909]: I1201 10:58:24.870933 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c539a3fa-4b2a-4a11-91ab-9996e6c0c99d-ssh-key\") pod \"c539a3fa-4b2a-4a11-91ab-9996e6c0c99d\" (UID: \"c539a3fa-4b2a-4a11-91ab-9996e6c0c99d\") " Dec 01 10:58:24 crc kubenswrapper[4909]: I1201 10:58:24.872115 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c539a3fa-4b2a-4a11-91ab-9996e6c0c99d-inventory\") pod \"c539a3fa-4b2a-4a11-91ab-9996e6c0c99d\" (UID: \"c539a3fa-4b2a-4a11-91ab-9996e6c0c99d\") " Dec 01 10:58:24 crc kubenswrapper[4909]: I1201 10:58:24.872561 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz6dx\" (UniqueName: \"kubernetes.io/projected/c539a3fa-4b2a-4a11-91ab-9996e6c0c99d-kube-api-access-wz6dx\") pod \"c539a3fa-4b2a-4a11-91ab-9996e6c0c99d\" (UID: \"c539a3fa-4b2a-4a11-91ab-9996e6c0c99d\") " Dec 01 10:58:24 crc kubenswrapper[4909]: I1201 10:58:24.878350 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c539a3fa-4b2a-4a11-91ab-9996e6c0c99d-kube-api-access-wz6dx" (OuterVolumeSpecName: "kube-api-access-wz6dx") pod "c539a3fa-4b2a-4a11-91ab-9996e6c0c99d" (UID: "c539a3fa-4b2a-4a11-91ab-9996e6c0c99d"). InnerVolumeSpecName "kube-api-access-wz6dx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:58:24 crc kubenswrapper[4909]: I1201 10:58:24.901657 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c539a3fa-4b2a-4a11-91ab-9996e6c0c99d-inventory" (OuterVolumeSpecName: "inventory") pod "c539a3fa-4b2a-4a11-91ab-9996e6c0c99d" (UID: "c539a3fa-4b2a-4a11-91ab-9996e6c0c99d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:58:24 crc kubenswrapper[4909]: I1201 10:58:24.901771 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c539a3fa-4b2a-4a11-91ab-9996e6c0c99d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c539a3fa-4b2a-4a11-91ab-9996e6c0c99d" (UID: "c539a3fa-4b2a-4a11-91ab-9996e6c0c99d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:58:24 crc kubenswrapper[4909]: I1201 10:58:24.976456 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz6dx\" (UniqueName: \"kubernetes.io/projected/c539a3fa-4b2a-4a11-91ab-9996e6c0c99d-kube-api-access-wz6dx\") on node \"crc\" DevicePath \"\"" Dec 01 10:58:24 crc kubenswrapper[4909]: I1201 10:58:24.976505 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c539a3fa-4b2a-4a11-91ab-9996e6c0c99d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 10:58:24 crc kubenswrapper[4909]: I1201 10:58:24.976518 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c539a3fa-4b2a-4a11-91ab-9996e6c0c99d-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.200240 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d" event={"ID":"c539a3fa-4b2a-4a11-91ab-9996e6c0c99d","Type":"ContainerDied","Data":"15ea661299a90a744c09b6a0e9c8c1d014a5dc49b6ec0c82dacf3cfc29f99bdc"} Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.200279 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15ea661299a90a744c09b6a0e9c8c1d014a5dc49b6ec0c82dacf3cfc29f99bdc" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.200322 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.301051 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn"] Dec 01 10:58:25 crc kubenswrapper[4909]: E1201 10:58:25.301527 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c539a3fa-4b2a-4a11-91ab-9996e6c0c99d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.301553 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c539a3fa-4b2a-4a11-91ab-9996e6c0c99d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 10:58:25 crc kubenswrapper[4909]: E1201 10:58:25.301565 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2345b837-4d7d-4d0a-b834-cb26782887ed" containerName="extract-utilities" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.301572 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2345b837-4d7d-4d0a-b834-cb26782887ed" containerName="extract-utilities" Dec 01 10:58:25 crc kubenswrapper[4909]: E1201 10:58:25.301594 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2345b837-4d7d-4d0a-b834-cb26782887ed" containerName="registry-server" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.301602 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2345b837-4d7d-4d0a-b834-cb26782887ed" containerName="registry-server" Dec 01 10:58:25 crc kubenswrapper[4909]: E1201 10:58:25.301626 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2345b837-4d7d-4d0a-b834-cb26782887ed" containerName="extract-content" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.301632 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2345b837-4d7d-4d0a-b834-cb26782887ed" containerName="extract-content" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.301838 
4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2345b837-4d7d-4d0a-b834-cb26782887ed" containerName="registry-server" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.301865 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c539a3fa-4b2a-4a11-91ab-9996e6c0c99d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.302645 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.305694 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.306055 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.306825 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.307039 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.314528 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn"] Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.486781 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbv6f\" (UniqueName: \"kubernetes.io/projected/cbc5c9d4-b897-448f-b8eb-6e16dc970df9-kube-api-access-hbv6f\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn\" (UID: \"cbc5c9d4-b897-448f-b8eb-6e16dc970df9\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn" Dec 01 10:58:25 crc 
kubenswrapper[4909]: I1201 10:58:25.486843 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbc5c9d4-b897-448f-b8eb-6e16dc970df9-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn\" (UID: \"cbc5c9d4-b897-448f-b8eb-6e16dc970df9\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.487101 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbc5c9d4-b897-448f-b8eb-6e16dc970df9-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn\" (UID: \"cbc5c9d4-b897-448f-b8eb-6e16dc970df9\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.589856 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbc5c9d4-b897-448f-b8eb-6e16dc970df9-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn\" (UID: \"cbc5c9d4-b897-448f-b8eb-6e16dc970df9\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.590431 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbv6f\" (UniqueName: \"kubernetes.io/projected/cbc5c9d4-b897-448f-b8eb-6e16dc970df9-kube-api-access-hbv6f\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn\" (UID: \"cbc5c9d4-b897-448f-b8eb-6e16dc970df9\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.590480 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbc5c9d4-b897-448f-b8eb-6e16dc970df9-ssh-key\") pod 
\"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn\" (UID: \"cbc5c9d4-b897-448f-b8eb-6e16dc970df9\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.594603 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbc5c9d4-b897-448f-b8eb-6e16dc970df9-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn\" (UID: \"cbc5c9d4-b897-448f-b8eb-6e16dc970df9\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.594787 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbc5c9d4-b897-448f-b8eb-6e16dc970df9-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn\" (UID: \"cbc5c9d4-b897-448f-b8eb-6e16dc970df9\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.606813 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbv6f\" (UniqueName: \"kubernetes.io/projected/cbc5c9d4-b897-448f-b8eb-6e16dc970df9-kube-api-access-hbv6f\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn\" (UID: \"cbc5c9d4-b897-448f-b8eb-6e16dc970df9\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn" Dec 01 10:58:25 crc kubenswrapper[4909]: I1201 10:58:25.618007 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn" Dec 01 10:58:26 crc kubenswrapper[4909]: I1201 10:58:26.121147 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn"] Dec 01 10:58:26 crc kubenswrapper[4909]: I1201 10:58:26.210502 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn" event={"ID":"cbc5c9d4-b897-448f-b8eb-6e16dc970df9","Type":"ContainerStarted","Data":"f016bbc6184726de3fac78a7aa0670fb0db0d3657b94df9c55ab12a5a26f9afc"} Dec 01 10:58:27 crc kubenswrapper[4909]: I1201 10:58:27.223934 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn" event={"ID":"cbc5c9d4-b897-448f-b8eb-6e16dc970df9","Type":"ContainerStarted","Data":"ae6117ff143fefd8aa362d187bc33f690629319fb96976a1c1db39090f4e2fb2"} Dec 01 10:58:27 crc kubenswrapper[4909]: I1201 10:58:27.253192 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn" podStartSLOduration=1.692623311 podStartE2EDuration="2.253164552s" podCreationTimestamp="2025-12-01 10:58:25 +0000 UTC" firstStartedPulling="2025-12-01 10:58:26.12707995 +0000 UTC m=+1623.361550848" lastFinishedPulling="2025-12-01 10:58:26.687621191 +0000 UTC m=+1623.922092089" observedRunningTime="2025-12-01 10:58:27.241598963 +0000 UTC m=+1624.476069891" watchObservedRunningTime="2025-12-01 10:58:27.253164552 +0000 UTC m=+1624.487635450" Dec 01 10:58:31 crc kubenswrapper[4909]: I1201 10:58:31.295874 4909 generic.go:334] "Generic (PLEG): container finished" podID="cbc5c9d4-b897-448f-b8eb-6e16dc970df9" containerID="ae6117ff143fefd8aa362d187bc33f690629319fb96976a1c1db39090f4e2fb2" exitCode=0 Dec 01 10:58:31 crc kubenswrapper[4909]: I1201 10:58:31.295985 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn" event={"ID":"cbc5c9d4-b897-448f-b8eb-6e16dc970df9","Type":"ContainerDied","Data":"ae6117ff143fefd8aa362d187bc33f690629319fb96976a1c1db39090f4e2fb2"} Dec 01 10:58:32 crc kubenswrapper[4909]: I1201 10:58:32.742039 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn" Dec 01 10:58:32 crc kubenswrapper[4909]: I1201 10:58:32.833165 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbv6f\" (UniqueName: \"kubernetes.io/projected/cbc5c9d4-b897-448f-b8eb-6e16dc970df9-kube-api-access-hbv6f\") pod \"cbc5c9d4-b897-448f-b8eb-6e16dc970df9\" (UID: \"cbc5c9d4-b897-448f-b8eb-6e16dc970df9\") " Dec 01 10:58:32 crc kubenswrapper[4909]: I1201 10:58:32.833797 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbc5c9d4-b897-448f-b8eb-6e16dc970df9-inventory\") pod \"cbc5c9d4-b897-448f-b8eb-6e16dc970df9\" (UID: \"cbc5c9d4-b897-448f-b8eb-6e16dc970df9\") " Dec 01 10:58:32 crc kubenswrapper[4909]: I1201 10:58:32.834252 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbc5c9d4-b897-448f-b8eb-6e16dc970df9-ssh-key\") pod \"cbc5c9d4-b897-448f-b8eb-6e16dc970df9\" (UID: \"cbc5c9d4-b897-448f-b8eb-6e16dc970df9\") " Dec 01 10:58:32 crc kubenswrapper[4909]: I1201 10:58:32.841286 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc5c9d4-b897-448f-b8eb-6e16dc970df9-kube-api-access-hbv6f" (OuterVolumeSpecName: "kube-api-access-hbv6f") pod "cbc5c9d4-b897-448f-b8eb-6e16dc970df9" (UID: "cbc5c9d4-b897-448f-b8eb-6e16dc970df9"). InnerVolumeSpecName "kube-api-access-hbv6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:58:32 crc kubenswrapper[4909]: I1201 10:58:32.862610 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc5c9d4-b897-448f-b8eb-6e16dc970df9-inventory" (OuterVolumeSpecName: "inventory") pod "cbc5c9d4-b897-448f-b8eb-6e16dc970df9" (UID: "cbc5c9d4-b897-448f-b8eb-6e16dc970df9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:58:32 crc kubenswrapper[4909]: I1201 10:58:32.863051 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc5c9d4-b897-448f-b8eb-6e16dc970df9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cbc5c9d4-b897-448f-b8eb-6e16dc970df9" (UID: "cbc5c9d4-b897-448f-b8eb-6e16dc970df9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:58:32 crc kubenswrapper[4909]: I1201 10:58:32.937359 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbv6f\" (UniqueName: \"kubernetes.io/projected/cbc5c9d4-b897-448f-b8eb-6e16dc970df9-kube-api-access-hbv6f\") on node \"crc\" DevicePath \"\"" Dec 01 10:58:32 crc kubenswrapper[4909]: I1201 10:58:32.937399 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbc5c9d4-b897-448f-b8eb-6e16dc970df9-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 10:58:32 crc kubenswrapper[4909]: I1201 10:58:32.937410 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbc5c9d4-b897-448f-b8eb-6e16dc970df9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.314636 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn" 
event={"ID":"cbc5c9d4-b897-448f-b8eb-6e16dc970df9","Type":"ContainerDied","Data":"f016bbc6184726de3fac78a7aa0670fb0db0d3657b94df9c55ab12a5a26f9afc"} Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.314710 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f016bbc6184726de3fac78a7aa0670fb0db0d3657b94df9c55ab12a5a26f9afc" Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.314819 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn" Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.416958 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2"] Dec 01 10:58:33 crc kubenswrapper[4909]: E1201 10:58:33.417459 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc5c9d4-b897-448f-b8eb-6e16dc970df9" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.417485 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc5c9d4-b897-448f-b8eb-6e16dc970df9" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.417780 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbc5c9d4-b897-448f-b8eb-6e16dc970df9" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.418635 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2" Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.421189 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.421483 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.421749 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.422430 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.427263 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2"] Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.549581 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53ed2809-6df5-479f-a447-57fec5cb16ca-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2\" (UID: \"53ed2809-6df5-479f-a447-57fec5cb16ca\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2" Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.549639 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53ed2809-6df5-479f-a447-57fec5cb16ca-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2\" (UID: \"53ed2809-6df5-479f-a447-57fec5cb16ca\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2" Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.549920 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86xgb\" (UniqueName: \"kubernetes.io/projected/53ed2809-6df5-479f-a447-57fec5cb16ca-kube-api-access-86xgb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2\" (UID: \"53ed2809-6df5-479f-a447-57fec5cb16ca\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2" Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.651871 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86xgb\" (UniqueName: \"kubernetes.io/projected/53ed2809-6df5-479f-a447-57fec5cb16ca-kube-api-access-86xgb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2\" (UID: \"53ed2809-6df5-479f-a447-57fec5cb16ca\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2" Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.652028 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53ed2809-6df5-479f-a447-57fec5cb16ca-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2\" (UID: \"53ed2809-6df5-479f-a447-57fec5cb16ca\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2" Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.652067 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53ed2809-6df5-479f-a447-57fec5cb16ca-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2\" (UID: \"53ed2809-6df5-479f-a447-57fec5cb16ca\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2" Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.657283 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53ed2809-6df5-479f-a447-57fec5cb16ca-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2\" (UID: 
\"53ed2809-6df5-479f-a447-57fec5cb16ca\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2" Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.657358 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53ed2809-6df5-479f-a447-57fec5cb16ca-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2\" (UID: \"53ed2809-6df5-479f-a447-57fec5cb16ca\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2" Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.667064 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86xgb\" (UniqueName: \"kubernetes.io/projected/53ed2809-6df5-479f-a447-57fec5cb16ca-kube-api-access-86xgb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2\" (UID: \"53ed2809-6df5-479f-a447-57fec5cb16ca\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2" Dec 01 10:58:33 crc kubenswrapper[4909]: I1201 10:58:33.736417 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2" Dec 01 10:58:34 crc kubenswrapper[4909]: I1201 10:58:34.285609 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2"] Dec 01 10:58:34 crc kubenswrapper[4909]: I1201 10:58:34.330247 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2" event={"ID":"53ed2809-6df5-479f-a447-57fec5cb16ca","Type":"ContainerStarted","Data":"5e7173a86a67ceb9fdc0d84ccd8ad34c06a1bb61b77eafe810fb178dc1d417bd"} Dec 01 10:58:35 crc kubenswrapper[4909]: I1201 10:58:35.340352 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2" event={"ID":"53ed2809-6df5-479f-a447-57fec5cb16ca","Type":"ContainerStarted","Data":"9388fe405be7a5d013e385d5eaf520b74383e7f80d4108ad99c0b38e1df51d6e"} Dec 01 10:58:35 crc kubenswrapper[4909]: I1201 10:58:35.360289 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2" podStartSLOduration=1.834085187 podStartE2EDuration="2.36026381s" podCreationTimestamp="2025-12-01 10:58:33 +0000 UTC" firstStartedPulling="2025-12-01 10:58:34.29748458 +0000 UTC m=+1631.531955468" lastFinishedPulling="2025-12-01 10:58:34.823663193 +0000 UTC m=+1632.058134091" observedRunningTime="2025-12-01 10:58:35.356410837 +0000 UTC m=+1632.590881735" watchObservedRunningTime="2025-12-01 10:58:35.36026381 +0000 UTC m=+1632.594734728" Dec 01 10:58:38 crc kubenswrapper[4909]: I1201 10:58:38.258042 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" Dec 01 10:58:38 crc kubenswrapper[4909]: E1201 10:58:38.259454 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 10:58:40 crc kubenswrapper[4909]: I1201 10:58:40.294690 4909 scope.go:117] "RemoveContainer" containerID="230179b68886a457a25985c3f4a52bfde6e833e8502b7cd22167b8ecea19df80" Dec 01 10:58:40 crc kubenswrapper[4909]: I1201 10:58:40.327587 4909 scope.go:117] "RemoveContainer" containerID="56371d0bb45f86268ad7522af8866f1d19b11fc7d8500014660c74b3dea75f46" Dec 01 10:58:40 crc kubenswrapper[4909]: I1201 10:58:40.367120 4909 scope.go:117] "RemoveContainer" containerID="d54eb1d388ec8d4fd2cb7eca3e3eea821fabdbee351d5546657d71b37b528015" Dec 01 10:58:40 crc kubenswrapper[4909]: I1201 10:58:40.425063 4909 scope.go:117] "RemoveContainer" containerID="a9fbadf42bb43c54520be45792690a305f564b69fbace35a4cc7701ebc44182f" Dec 01 10:58:40 crc kubenswrapper[4909]: I1201 10:58:40.480049 4909 scope.go:117] "RemoveContainer" containerID="c31a3a73c919ec3abf98051cc3c4a6f5527c01f74209a519b4f5e57054cc8109" Dec 01 10:58:40 crc kubenswrapper[4909]: I1201 10:58:40.541052 4909 scope.go:117] "RemoveContainer" containerID="ba3e371b99cd19fd004a535768c38eb3df40f1a5f5e3a546cdeab6fcfbebe989" Dec 01 10:58:40 crc kubenswrapper[4909]: I1201 10:58:40.561700 4909 scope.go:117] "RemoveContainer" containerID="2c9a8a3316e52561aa182934f04e3406c535bac7962bc6215d33f1aae52b2842" Dec 01 10:58:40 crc kubenswrapper[4909]: I1201 10:58:40.581590 4909 scope.go:117] "RemoveContainer" containerID="d84d45e370457c4913ada2e3cffde5a5fc7aa63070a81d6561afb65c04df13a6" Dec 01 10:58:40 crc kubenswrapper[4909]: I1201 10:58:40.600116 4909 scope.go:117] "RemoveContainer" containerID="1f91f3690f8e907ac02cdf05720ddd9d69a17399820d9ea258c556c4198a9585" Dec 01 10:58:40 crc kubenswrapper[4909]: I1201 10:58:40.619772 4909 scope.go:117] 
"RemoveContainer" containerID="e02bfc85d62a759463f53a75b25bb74cfaf679f7e4b51d3ee20476be4578288d" Dec 01 10:58:40 crc kubenswrapper[4909]: I1201 10:58:40.642692 4909 scope.go:117] "RemoveContainer" containerID="1eb1437d42668fff56e46dc63ac5a12147734fde7aacbb674871a966e6b5541c" Dec 01 10:58:40 crc kubenswrapper[4909]: I1201 10:58:40.660979 4909 scope.go:117] "RemoveContainer" containerID="061e842010ad862f56cb5f42076ec711792d6754d9348913fbd62ad54e101f0b" Dec 01 10:58:41 crc kubenswrapper[4909]: I1201 10:58:41.040132 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-qqfxj"] Dec 01 10:58:41 crc kubenswrapper[4909]: I1201 10:58:41.050792 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-qqfxj"] Dec 01 10:58:41 crc kubenswrapper[4909]: I1201 10:58:41.270023 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="672cc9fe-2684-446f-8c16-59c03ee23678" path="/var/lib/kubelet/pods/672cc9fe-2684-446f-8c16-59c03ee23678/volumes" Dec 01 10:58:43 crc kubenswrapper[4909]: I1201 10:58:43.039833 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-2jwcr"] Dec 01 10:58:43 crc kubenswrapper[4909]: I1201 10:58:43.054017 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-2jwcr"] Dec 01 10:58:43 crc kubenswrapper[4909]: I1201 10:58:43.272346 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e36ea017-d935-4011-8590-0aaa002229de" path="/var/lib/kubelet/pods/e36ea017-d935-4011-8590-0aaa002229de/volumes" Dec 01 10:58:47 crc kubenswrapper[4909]: I1201 10:58:47.057480 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-9whz5"] Dec 01 10:58:47 crc kubenswrapper[4909]: I1201 10:58:47.065040 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-9whz5"] Dec 01 10:58:47 crc kubenswrapper[4909]: I1201 10:58:47.270214 4909 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="cbebc139-adb5-4785-8db0-283b85b33fda" path="/var/lib/kubelet/pods/cbebc139-adb5-4785-8db0-283b85b33fda/volumes" Dec 01 10:58:53 crc kubenswrapper[4909]: I1201 10:58:53.265280 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" Dec 01 10:58:53 crc kubenswrapper[4909]: E1201 10:58:53.266136 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 10:59:02 crc kubenswrapper[4909]: I1201 10:59:02.037491 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-975kr"] Dec 01 10:59:02 crc kubenswrapper[4909]: I1201 10:59:02.045448 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-975kr"] Dec 01 10:59:03 crc kubenswrapper[4909]: I1201 10:59:03.036781 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-6nt8w"] Dec 01 10:59:03 crc kubenswrapper[4909]: I1201 10:59:03.046561 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-6nt8w"] Dec 01 10:59:03 crc kubenswrapper[4909]: I1201 10:59:03.277992 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00cc5cc5-c22b-4523-96c5-baa507ad0ce1" path="/var/lib/kubelet/pods/00cc5cc5-c22b-4523-96c5-baa507ad0ce1/volumes" Dec 01 10:59:03 crc kubenswrapper[4909]: I1201 10:59:03.279201 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cd6cd1c-99b8-4639-828e-9585790b9d26" path="/var/lib/kubelet/pods/8cd6cd1c-99b8-4639-828e-9585790b9d26/volumes" Dec 01 10:59:05 crc kubenswrapper[4909]: 
I1201 10:59:05.257317 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" Dec 01 10:59:05 crc kubenswrapper[4909]: E1201 10:59:05.257794 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 10:59:16 crc kubenswrapper[4909]: I1201 10:59:16.256796 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" Dec 01 10:59:16 crc kubenswrapper[4909]: E1201 10:59:16.258225 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 10:59:25 crc kubenswrapper[4909]: I1201 10:59:25.811732 4909 generic.go:334] "Generic (PLEG): container finished" podID="53ed2809-6df5-479f-a447-57fec5cb16ca" containerID="9388fe405be7a5d013e385d5eaf520b74383e7f80d4108ad99c0b38e1df51d6e" exitCode=0 Dec 01 10:59:25 crc kubenswrapper[4909]: I1201 10:59:25.811798 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2" event={"ID":"53ed2809-6df5-479f-a447-57fec5cb16ca","Type":"ContainerDied","Data":"9388fe405be7a5d013e385d5eaf520b74383e7f80d4108ad99c0b38e1df51d6e"} Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.257491 4909 scope.go:117] "RemoveContainer" 
containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" Dec 01 10:59:27 crc kubenswrapper[4909]: E1201 10:59:27.258178 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.283379 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2" Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.364859 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53ed2809-6df5-479f-a447-57fec5cb16ca-ssh-key\") pod \"53ed2809-6df5-479f-a447-57fec5cb16ca\" (UID: \"53ed2809-6df5-479f-a447-57fec5cb16ca\") " Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.364962 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86xgb\" (UniqueName: \"kubernetes.io/projected/53ed2809-6df5-479f-a447-57fec5cb16ca-kube-api-access-86xgb\") pod \"53ed2809-6df5-479f-a447-57fec5cb16ca\" (UID: \"53ed2809-6df5-479f-a447-57fec5cb16ca\") " Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.365140 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53ed2809-6df5-479f-a447-57fec5cb16ca-inventory\") pod \"53ed2809-6df5-479f-a447-57fec5cb16ca\" (UID: \"53ed2809-6df5-479f-a447-57fec5cb16ca\") " Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.391278 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/53ed2809-6df5-479f-a447-57fec5cb16ca-kube-api-access-86xgb" (OuterVolumeSpecName: "kube-api-access-86xgb") pod "53ed2809-6df5-479f-a447-57fec5cb16ca" (UID: "53ed2809-6df5-479f-a447-57fec5cb16ca"). InnerVolumeSpecName "kube-api-access-86xgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.397368 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53ed2809-6df5-479f-a447-57fec5cb16ca-inventory" (OuterVolumeSpecName: "inventory") pod "53ed2809-6df5-479f-a447-57fec5cb16ca" (UID: "53ed2809-6df5-479f-a447-57fec5cb16ca"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.398479 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53ed2809-6df5-479f-a447-57fec5cb16ca-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "53ed2809-6df5-479f-a447-57fec5cb16ca" (UID: "53ed2809-6df5-479f-a447-57fec5cb16ca"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.468621 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53ed2809-6df5-479f-a447-57fec5cb16ca-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.468656 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53ed2809-6df5-479f-a447-57fec5cb16ca-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.468672 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86xgb\" (UniqueName: \"kubernetes.io/projected/53ed2809-6df5-479f-a447-57fec5cb16ca-kube-api-access-86xgb\") on node \"crc\" DevicePath \"\"" Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.830740 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2" event={"ID":"53ed2809-6df5-479f-a447-57fec5cb16ca","Type":"ContainerDied","Data":"5e7173a86a67ceb9fdc0d84ccd8ad34c06a1bb61b77eafe810fb178dc1d417bd"} Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.830781 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e7173a86a67ceb9fdc0d84ccd8ad34c06a1bb61b77eafe810fb178dc1d417bd" Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.831166 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2" Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.922295 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2s8gf"] Dec 01 10:59:27 crc kubenswrapper[4909]: E1201 10:59:27.924047 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ed2809-6df5-479f-a447-57fec5cb16ca" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.924073 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ed2809-6df5-479f-a447-57fec5cb16ca" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.924555 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="53ed2809-6df5-479f-a447-57fec5cb16ca" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.928484 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2s8gf" Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.931531 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.935214 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.935520 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.935941 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 10:59:27 crc kubenswrapper[4909]: I1201 10:59:27.957113 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2s8gf"] Dec 01 10:59:28 crc kubenswrapper[4909]: I1201 10:59:28.080454 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2305ea3d-8ee9-43ec-967b-c19b3088c24e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2s8gf\" (UID: \"2305ea3d-8ee9-43ec-967b-c19b3088c24e\") " pod="openstack/ssh-known-hosts-edpm-deployment-2s8gf" Dec 01 10:59:28 crc kubenswrapper[4909]: I1201 10:59:28.080612 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfm2b\" (UniqueName: \"kubernetes.io/projected/2305ea3d-8ee9-43ec-967b-c19b3088c24e-kube-api-access-mfm2b\") pod \"ssh-known-hosts-edpm-deployment-2s8gf\" (UID: \"2305ea3d-8ee9-43ec-967b-c19b3088c24e\") " pod="openstack/ssh-known-hosts-edpm-deployment-2s8gf" Dec 01 10:59:28 crc kubenswrapper[4909]: I1201 10:59:28.080687 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2305ea3d-8ee9-43ec-967b-c19b3088c24e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2s8gf\" (UID: \"2305ea3d-8ee9-43ec-967b-c19b3088c24e\") " pod="openstack/ssh-known-hosts-edpm-deployment-2s8gf" Dec 01 10:59:28 crc kubenswrapper[4909]: I1201 10:59:28.182005 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2305ea3d-8ee9-43ec-967b-c19b3088c24e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2s8gf\" (UID: \"2305ea3d-8ee9-43ec-967b-c19b3088c24e\") " pod="openstack/ssh-known-hosts-edpm-deployment-2s8gf" Dec 01 10:59:28 crc kubenswrapper[4909]: I1201 10:59:28.182099 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2305ea3d-8ee9-43ec-967b-c19b3088c24e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2s8gf\" (UID: \"2305ea3d-8ee9-43ec-967b-c19b3088c24e\") " pod="openstack/ssh-known-hosts-edpm-deployment-2s8gf" Dec 01 10:59:28 crc kubenswrapper[4909]: I1201 10:59:28.182180 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfm2b\" (UniqueName: \"kubernetes.io/projected/2305ea3d-8ee9-43ec-967b-c19b3088c24e-kube-api-access-mfm2b\") pod \"ssh-known-hosts-edpm-deployment-2s8gf\" (UID: \"2305ea3d-8ee9-43ec-967b-c19b3088c24e\") " pod="openstack/ssh-known-hosts-edpm-deployment-2s8gf" Dec 01 10:59:28 crc kubenswrapper[4909]: I1201 10:59:28.187685 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2305ea3d-8ee9-43ec-967b-c19b3088c24e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2s8gf\" (UID: \"2305ea3d-8ee9-43ec-967b-c19b3088c24e\") " pod="openstack/ssh-known-hosts-edpm-deployment-2s8gf" Dec 01 10:59:28 crc kubenswrapper[4909]: I1201 10:59:28.189507 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2305ea3d-8ee9-43ec-967b-c19b3088c24e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2s8gf\" (UID: \"2305ea3d-8ee9-43ec-967b-c19b3088c24e\") " pod="openstack/ssh-known-hosts-edpm-deployment-2s8gf" Dec 01 10:59:28 crc kubenswrapper[4909]: I1201 10:59:28.206185 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfm2b\" (UniqueName: \"kubernetes.io/projected/2305ea3d-8ee9-43ec-967b-c19b3088c24e-kube-api-access-mfm2b\") pod \"ssh-known-hosts-edpm-deployment-2s8gf\" (UID: \"2305ea3d-8ee9-43ec-967b-c19b3088c24e\") " pod="openstack/ssh-known-hosts-edpm-deployment-2s8gf" Dec 01 10:59:28 crc kubenswrapper[4909]: I1201 10:59:28.257673 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2s8gf" Dec 01 10:59:28 crc kubenswrapper[4909]: I1201 10:59:28.768217 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2s8gf"] Dec 01 10:59:28 crc kubenswrapper[4909]: I1201 10:59:28.841339 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2s8gf" event={"ID":"2305ea3d-8ee9-43ec-967b-c19b3088c24e","Type":"ContainerStarted","Data":"c6f09adb858371b3806d7443dba02c5478c6e152794eb4750261906741431087"} Dec 01 10:59:29 crc kubenswrapper[4909]: I1201 10:59:29.851382 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2s8gf" event={"ID":"2305ea3d-8ee9-43ec-967b-c19b3088c24e","Type":"ContainerStarted","Data":"558f8db742b7478f3c474863faf382206ca28f5350ab0b35d73a2c6e226de0b3"} Dec 01 10:59:29 crc kubenswrapper[4909]: I1201 10:59:29.869833 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-2s8gf" 
podStartSLOduration=2.314115057 podStartE2EDuration="2.869812833s" podCreationTimestamp="2025-12-01 10:59:27 +0000 UTC" firstStartedPulling="2025-12-01 10:59:28.773447001 +0000 UTC m=+1686.007917899" lastFinishedPulling="2025-12-01 10:59:29.329144777 +0000 UTC m=+1686.563615675" observedRunningTime="2025-12-01 10:59:29.869075819 +0000 UTC m=+1687.103546717" watchObservedRunningTime="2025-12-01 10:59:29.869812833 +0000 UTC m=+1687.104283741" Dec 01 10:59:36 crc kubenswrapper[4909]: I1201 10:59:36.926715 4909 generic.go:334] "Generic (PLEG): container finished" podID="2305ea3d-8ee9-43ec-967b-c19b3088c24e" containerID="558f8db742b7478f3c474863faf382206ca28f5350ab0b35d73a2c6e226de0b3" exitCode=0 Dec 01 10:59:36 crc kubenswrapper[4909]: I1201 10:59:36.926797 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2s8gf" event={"ID":"2305ea3d-8ee9-43ec-967b-c19b3088c24e","Type":"ContainerDied","Data":"558f8db742b7478f3c474863faf382206ca28f5350ab0b35d73a2c6e226de0b3"} Dec 01 10:59:38 crc kubenswrapper[4909]: I1201 10:59:38.257686 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" Dec 01 10:59:38 crc kubenswrapper[4909]: E1201 10:59:38.258458 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 10:59:38 crc kubenswrapper[4909]: I1201 10:59:38.320264 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2s8gf" Dec 01 10:59:38 crc kubenswrapper[4909]: I1201 10:59:38.378043 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfm2b\" (UniqueName: \"kubernetes.io/projected/2305ea3d-8ee9-43ec-967b-c19b3088c24e-kube-api-access-mfm2b\") pod \"2305ea3d-8ee9-43ec-967b-c19b3088c24e\" (UID: \"2305ea3d-8ee9-43ec-967b-c19b3088c24e\") " Dec 01 10:59:38 crc kubenswrapper[4909]: I1201 10:59:38.378177 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2305ea3d-8ee9-43ec-967b-c19b3088c24e-inventory-0\") pod \"2305ea3d-8ee9-43ec-967b-c19b3088c24e\" (UID: \"2305ea3d-8ee9-43ec-967b-c19b3088c24e\") " Dec 01 10:59:38 crc kubenswrapper[4909]: I1201 10:59:38.378342 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2305ea3d-8ee9-43ec-967b-c19b3088c24e-ssh-key-openstack-edpm-ipam\") pod \"2305ea3d-8ee9-43ec-967b-c19b3088c24e\" (UID: \"2305ea3d-8ee9-43ec-967b-c19b3088c24e\") " Dec 01 10:59:38 crc kubenswrapper[4909]: I1201 10:59:38.383832 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2305ea3d-8ee9-43ec-967b-c19b3088c24e-kube-api-access-mfm2b" (OuterVolumeSpecName: "kube-api-access-mfm2b") pod "2305ea3d-8ee9-43ec-967b-c19b3088c24e" (UID: "2305ea3d-8ee9-43ec-967b-c19b3088c24e"). InnerVolumeSpecName "kube-api-access-mfm2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:59:38 crc kubenswrapper[4909]: I1201 10:59:38.404630 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2305ea3d-8ee9-43ec-967b-c19b3088c24e-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "2305ea3d-8ee9-43ec-967b-c19b3088c24e" (UID: "2305ea3d-8ee9-43ec-967b-c19b3088c24e"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:59:38 crc kubenswrapper[4909]: I1201 10:59:38.407808 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2305ea3d-8ee9-43ec-967b-c19b3088c24e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2305ea3d-8ee9-43ec-967b-c19b3088c24e" (UID: "2305ea3d-8ee9-43ec-967b-c19b3088c24e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:59:38 crc kubenswrapper[4909]: I1201 10:59:38.480695 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfm2b\" (UniqueName: \"kubernetes.io/projected/2305ea3d-8ee9-43ec-967b-c19b3088c24e-kube-api-access-mfm2b\") on node \"crc\" DevicePath \"\"" Dec 01 10:59:38 crc kubenswrapper[4909]: I1201 10:59:38.480732 4909 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2305ea3d-8ee9-43ec-967b-c19b3088c24e-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 01 10:59:38 crc kubenswrapper[4909]: I1201 10:59:38.480745 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2305ea3d-8ee9-43ec-967b-c19b3088c24e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 01 10:59:38 crc kubenswrapper[4909]: I1201 10:59:38.948723 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2s8gf" event={"ID":"2305ea3d-8ee9-43ec-967b-c19b3088c24e","Type":"ContainerDied","Data":"c6f09adb858371b3806d7443dba02c5478c6e152794eb4750261906741431087"} Dec 01 10:59:38 crc kubenswrapper[4909]: I1201 10:59:38.948766 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6f09adb858371b3806d7443dba02c5478c6e152794eb4750261906741431087" Dec 01 10:59:38 crc kubenswrapper[4909]: I1201 10:59:38.948783 
4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2s8gf" Dec 01 10:59:39 crc kubenswrapper[4909]: I1201 10:59:39.018141 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh"] Dec 01 10:59:39 crc kubenswrapper[4909]: E1201 10:59:39.018769 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2305ea3d-8ee9-43ec-967b-c19b3088c24e" containerName="ssh-known-hosts-edpm-deployment" Dec 01 10:59:39 crc kubenswrapper[4909]: I1201 10:59:39.018796 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2305ea3d-8ee9-43ec-967b-c19b3088c24e" containerName="ssh-known-hosts-edpm-deployment" Dec 01 10:59:39 crc kubenswrapper[4909]: I1201 10:59:39.019091 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2305ea3d-8ee9-43ec-967b-c19b3088c24e" containerName="ssh-known-hosts-edpm-deployment" Dec 01 10:59:39 crc kubenswrapper[4909]: I1201 10:59:39.019897 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh" Dec 01 10:59:39 crc kubenswrapper[4909]: I1201 10:59:39.023348 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 10:59:39 crc kubenswrapper[4909]: I1201 10:59:39.024314 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 10:59:39 crc kubenswrapper[4909]: I1201 10:59:39.025043 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 10:59:39 crc kubenswrapper[4909]: I1201 10:59:39.025396 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 10:59:39 crc kubenswrapper[4909]: I1201 10:59:39.032156 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh"] Dec 01 10:59:39 crc kubenswrapper[4909]: I1201 10:59:39.092172 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e095497-a8dc-43e9-b64b-68382cd859be-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5lfnh\" (UID: \"2e095497-a8dc-43e9-b64b-68382cd859be\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh" Dec 01 10:59:39 crc kubenswrapper[4909]: I1201 10:59:39.092232 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e095497-a8dc-43e9-b64b-68382cd859be-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5lfnh\" (UID: \"2e095497-a8dc-43e9-b64b-68382cd859be\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh" Dec 01 10:59:39 crc kubenswrapper[4909]: I1201 10:59:39.092293 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh28v\" (UniqueName: \"kubernetes.io/projected/2e095497-a8dc-43e9-b64b-68382cd859be-kube-api-access-xh28v\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5lfnh\" (UID: \"2e095497-a8dc-43e9-b64b-68382cd859be\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh" Dec 01 10:59:39 crc kubenswrapper[4909]: I1201 10:59:39.193683 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e095497-a8dc-43e9-b64b-68382cd859be-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5lfnh\" (UID: \"2e095497-a8dc-43e9-b64b-68382cd859be\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh" Dec 01 10:59:39 crc kubenswrapper[4909]: I1201 10:59:39.193750 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e095497-a8dc-43e9-b64b-68382cd859be-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5lfnh\" (UID: \"2e095497-a8dc-43e9-b64b-68382cd859be\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh" Dec 01 10:59:39 crc kubenswrapper[4909]: I1201 10:59:39.193774 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh28v\" (UniqueName: \"kubernetes.io/projected/2e095497-a8dc-43e9-b64b-68382cd859be-kube-api-access-xh28v\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5lfnh\" (UID: \"2e095497-a8dc-43e9-b64b-68382cd859be\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh" Dec 01 10:59:39 crc kubenswrapper[4909]: I1201 10:59:39.197529 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e095497-a8dc-43e9-b64b-68382cd859be-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5lfnh\" (UID: \"2e095497-a8dc-43e9-b64b-68382cd859be\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh" Dec 01 10:59:39 crc kubenswrapper[4909]: I1201 10:59:39.197775 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e095497-a8dc-43e9-b64b-68382cd859be-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5lfnh\" (UID: \"2e095497-a8dc-43e9-b64b-68382cd859be\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh" Dec 01 10:59:39 crc kubenswrapper[4909]: I1201 10:59:39.210890 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh28v\" (UniqueName: \"kubernetes.io/projected/2e095497-a8dc-43e9-b64b-68382cd859be-kube-api-access-xh28v\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5lfnh\" (UID: \"2e095497-a8dc-43e9-b64b-68382cd859be\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh" Dec 01 10:59:39 crc kubenswrapper[4909]: I1201 10:59:39.336413 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh" Dec 01 10:59:39 crc kubenswrapper[4909]: I1201 10:59:39.849321 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh"] Dec 01 10:59:39 crc kubenswrapper[4909]: W1201 10:59:39.852523 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e095497_a8dc_43e9_b64b_68382cd859be.slice/crio-728fba8df9011f8829a3fd005e2f250bc8eaff22bda1b8670435b32852026a3e WatchSource:0}: Error finding container 728fba8df9011f8829a3fd005e2f250bc8eaff22bda1b8670435b32852026a3e: Status 404 returned error can't find the container with id 728fba8df9011f8829a3fd005e2f250bc8eaff22bda1b8670435b32852026a3e Dec 01 10:59:39 crc kubenswrapper[4909]: I1201 10:59:39.957820 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh" event={"ID":"2e095497-a8dc-43e9-b64b-68382cd859be","Type":"ContainerStarted","Data":"728fba8df9011f8829a3fd005e2f250bc8eaff22bda1b8670435b32852026a3e"} Dec 01 10:59:40 crc kubenswrapper[4909]: I1201 10:59:40.865757 4909 scope.go:117] "RemoveContainer" containerID="79dd6ce6aac679be3bd5da57d44077b6af82cddf551a6466138d7285524d5ef3" Dec 01 10:59:40 crc kubenswrapper[4909]: I1201 10:59:40.898701 4909 scope.go:117] "RemoveContainer" containerID="8711a9b6a33a1ddee459a43a67208d9e582e2890e46890cd4d3ca3e677d3e22a" Dec 01 10:59:40 crc kubenswrapper[4909]: I1201 10:59:40.935845 4909 scope.go:117] "RemoveContainer" containerID="0eb9e3c4665fd1d6854ac6b37fbe4826ac4563d9e160e20e11d8b7c643d33471" Dec 01 10:59:41 crc kubenswrapper[4909]: I1201 10:59:41.007715 4909 scope.go:117] "RemoveContainer" containerID="b417a1c144ac4d4fa82c5157225f49b226a8eca42b3b220d12001f4ad0fe1e95" Dec 01 10:59:41 crc kubenswrapper[4909]: I1201 10:59:41.048670 4909 scope.go:117] "RemoveContainer" 
containerID="f14a8d94e57329faf4f94341d2e01775e8efdcfbbb3a7060d87d92fe0b787fa7" Dec 01 10:59:41 crc kubenswrapper[4909]: I1201 10:59:41.981558 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh" event={"ID":"2e095497-a8dc-43e9-b64b-68382cd859be","Type":"ContainerStarted","Data":"3e97f64bc055c7c00e582532c5a6db76a4a39df8226c08a08996370a7861e495"} Dec 01 10:59:42 crc kubenswrapper[4909]: I1201 10:59:42.017444 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh" podStartSLOduration=2.8454963490000003 podStartE2EDuration="4.017423298s" podCreationTimestamp="2025-12-01 10:59:38 +0000 UTC" firstStartedPulling="2025-12-01 10:59:39.854715503 +0000 UTC m=+1697.089186401" lastFinishedPulling="2025-12-01 10:59:41.026642462 +0000 UTC m=+1698.261113350" observedRunningTime="2025-12-01 10:59:42.006664457 +0000 UTC m=+1699.241135365" watchObservedRunningTime="2025-12-01 10:59:42.017423298 +0000 UTC m=+1699.251894196" Dec 01 10:59:44 crc kubenswrapper[4909]: I1201 10:59:44.063446 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-303c-account-create-update-8ff88"] Dec 01 10:59:44 crc kubenswrapper[4909]: I1201 10:59:44.073127 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-8nwnt"] Dec 01 10:59:44 crc kubenswrapper[4909]: I1201 10:59:44.083643 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-85ccm"] Dec 01 10:59:44 crc kubenswrapper[4909]: I1201 10:59:44.091623 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-f543-account-create-update-qnsrz"] Dec 01 10:59:44 crc kubenswrapper[4909]: I1201 10:59:44.098860 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-nhslm"] Dec 01 10:59:44 crc kubenswrapper[4909]: I1201 10:59:44.105550 4909 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-nhslm"] Dec 01 10:59:44 crc kubenswrapper[4909]: I1201 10:59:44.112751 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-85ccm"] Dec 01 10:59:44 crc kubenswrapper[4909]: I1201 10:59:44.123111 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c69c-account-create-update-w655r"] Dec 01 10:59:44 crc kubenswrapper[4909]: I1201 10:59:44.133272 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-f543-account-create-update-qnsrz"] Dec 01 10:59:44 crc kubenswrapper[4909]: I1201 10:59:44.143501 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-8nwnt"] Dec 01 10:59:44 crc kubenswrapper[4909]: I1201 10:59:44.153820 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-303c-account-create-update-8ff88"] Dec 01 10:59:44 crc kubenswrapper[4909]: I1201 10:59:44.163328 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-c69c-account-create-update-w655r"] Dec 01 10:59:45 crc kubenswrapper[4909]: I1201 10:59:45.266847 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c829e27-1fd3-4d20-a4c8-080580ea341d" path="/var/lib/kubelet/pods/0c829e27-1fd3-4d20-a4c8-080580ea341d/volumes" Dec 01 10:59:45 crc kubenswrapper[4909]: I1201 10:59:45.267833 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17519694-f981-49fe-8579-c037e0afd59a" path="/var/lib/kubelet/pods/17519694-f981-49fe-8579-c037e0afd59a/volumes" Dec 01 10:59:45 crc kubenswrapper[4909]: I1201 10:59:45.268369 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ddd34b0-2faa-4d36-a8f6-485e2d05a71a" path="/var/lib/kubelet/pods/1ddd34b0-2faa-4d36-a8f6-485e2d05a71a/volumes" Dec 01 10:59:45 crc kubenswrapper[4909]: I1201 10:59:45.268934 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="39a13de5-4b8b-4ec1-b6ab-de297ac31eea" path="/var/lib/kubelet/pods/39a13de5-4b8b-4ec1-b6ab-de297ac31eea/volumes" Dec 01 10:59:45 crc kubenswrapper[4909]: I1201 10:59:45.270000 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d8848ea-48fe-4b9c-9cd2-6935a6a28717" path="/var/lib/kubelet/pods/3d8848ea-48fe-4b9c-9cd2-6935a6a28717/volumes" Dec 01 10:59:45 crc kubenswrapper[4909]: I1201 10:59:45.270529 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a15e2fbe-81d5-4d34-a5a4-e920de0c607e" path="/var/lib/kubelet/pods/a15e2fbe-81d5-4d34-a5a4-e920de0c607e/volumes" Dec 01 10:59:50 crc kubenswrapper[4909]: I1201 10:59:50.064122 4909 generic.go:334] "Generic (PLEG): container finished" podID="2e095497-a8dc-43e9-b64b-68382cd859be" containerID="3e97f64bc055c7c00e582532c5a6db76a4a39df8226c08a08996370a7861e495" exitCode=0 Dec 01 10:59:50 crc kubenswrapper[4909]: I1201 10:59:50.064214 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh" event={"ID":"2e095497-a8dc-43e9-b64b-68382cd859be","Type":"ContainerDied","Data":"3e97f64bc055c7c00e582532c5a6db76a4a39df8226c08a08996370a7861e495"} Dec 01 10:59:50 crc kubenswrapper[4909]: I1201 10:59:50.257299 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" Dec 01 10:59:50 crc kubenswrapper[4909]: E1201 10:59:50.257565 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 10:59:51 crc kubenswrapper[4909]: I1201 10:59:51.462630 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh"
Dec 01 10:59:51 crc kubenswrapper[4909]: I1201 10:59:51.533503 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh28v\" (UniqueName: \"kubernetes.io/projected/2e095497-a8dc-43e9-b64b-68382cd859be-kube-api-access-xh28v\") pod \"2e095497-a8dc-43e9-b64b-68382cd859be\" (UID: \"2e095497-a8dc-43e9-b64b-68382cd859be\") "
Dec 01 10:59:51 crc kubenswrapper[4909]: I1201 10:59:51.533634 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e095497-a8dc-43e9-b64b-68382cd859be-inventory\") pod \"2e095497-a8dc-43e9-b64b-68382cd859be\" (UID: \"2e095497-a8dc-43e9-b64b-68382cd859be\") "
Dec 01 10:59:51 crc kubenswrapper[4909]: I1201 10:59:51.533912 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e095497-a8dc-43e9-b64b-68382cd859be-ssh-key\") pod \"2e095497-a8dc-43e9-b64b-68382cd859be\" (UID: \"2e095497-a8dc-43e9-b64b-68382cd859be\") "
Dec 01 10:59:51 crc kubenswrapper[4909]: I1201 10:59:51.539978 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e095497-a8dc-43e9-b64b-68382cd859be-kube-api-access-xh28v" (OuterVolumeSpecName: "kube-api-access-xh28v") pod "2e095497-a8dc-43e9-b64b-68382cd859be" (UID: "2e095497-a8dc-43e9-b64b-68382cd859be"). InnerVolumeSpecName "kube-api-access-xh28v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:59:51 crc kubenswrapper[4909]: I1201 10:59:51.560312 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e095497-a8dc-43e9-b64b-68382cd859be-inventory" (OuterVolumeSpecName: "inventory") pod "2e095497-a8dc-43e9-b64b-68382cd859be" (UID: "2e095497-a8dc-43e9-b64b-68382cd859be"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:59:51 crc kubenswrapper[4909]: I1201 10:59:51.560365 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e095497-a8dc-43e9-b64b-68382cd859be-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2e095497-a8dc-43e9-b64b-68382cd859be" (UID: "2e095497-a8dc-43e9-b64b-68382cd859be"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:59:51 crc kubenswrapper[4909]: I1201 10:59:51.636131 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e095497-a8dc-43e9-b64b-68382cd859be-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 01 10:59:51 crc kubenswrapper[4909]: I1201 10:59:51.636165 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh28v\" (UniqueName: \"kubernetes.io/projected/2e095497-a8dc-43e9-b64b-68382cd859be-kube-api-access-xh28v\") on node \"crc\" DevicePath \"\""
Dec 01 10:59:51 crc kubenswrapper[4909]: I1201 10:59:51.636175 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e095497-a8dc-43e9-b64b-68382cd859be-inventory\") on node \"crc\" DevicePath \"\""
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.085075 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh" event={"ID":"2e095497-a8dc-43e9-b64b-68382cd859be","Type":"ContainerDied","Data":"728fba8df9011f8829a3fd005e2f250bc8eaff22bda1b8670435b32852026a3e"}
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.085113 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="728fba8df9011f8829a3fd005e2f250bc8eaff22bda1b8670435b32852026a3e"
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.085173 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh"
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.156241 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd"]
Dec 01 10:59:52 crc kubenswrapper[4909]: E1201 10:59:52.156711 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e095497-a8dc-43e9-b64b-68382cd859be" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.156728 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e095497-a8dc-43e9-b64b-68382cd859be" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.156959 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e095497-a8dc-43e9-b64b-68382cd859be" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.157693 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd"
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.162427 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.162614 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.162700 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.162718 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv"
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.167797 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd"]
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.248226 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6405a27a-4265-4699-af98-e6dd88fb1cb1-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd\" (UID: \"6405a27a-4265-4699-af98-e6dd88fb1cb1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd"
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.248461 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6405a27a-4265-4699-af98-e6dd88fb1cb1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd\" (UID: \"6405a27a-4265-4699-af98-e6dd88fb1cb1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd"
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.248588 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcq6w\" (UniqueName: \"kubernetes.io/projected/6405a27a-4265-4699-af98-e6dd88fb1cb1-kube-api-access-kcq6w\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd\" (UID: \"6405a27a-4265-4699-af98-e6dd88fb1cb1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd"
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.350663 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6405a27a-4265-4699-af98-e6dd88fb1cb1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd\" (UID: \"6405a27a-4265-4699-af98-e6dd88fb1cb1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd"
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.350763 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcq6w\" (UniqueName: \"kubernetes.io/projected/6405a27a-4265-4699-af98-e6dd88fb1cb1-kube-api-access-kcq6w\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd\" (UID: \"6405a27a-4265-4699-af98-e6dd88fb1cb1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd"
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.350979 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6405a27a-4265-4699-af98-e6dd88fb1cb1-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd\" (UID: \"6405a27a-4265-4699-af98-e6dd88fb1cb1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd"
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.354683 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6405a27a-4265-4699-af98-e6dd88fb1cb1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd\" (UID: \"6405a27a-4265-4699-af98-e6dd88fb1cb1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd"
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.356553 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6405a27a-4265-4699-af98-e6dd88fb1cb1-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd\" (UID: \"6405a27a-4265-4699-af98-e6dd88fb1cb1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd"
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.371322 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcq6w\" (UniqueName: \"kubernetes.io/projected/6405a27a-4265-4699-af98-e6dd88fb1cb1-kube-api-access-kcq6w\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd\" (UID: \"6405a27a-4265-4699-af98-e6dd88fb1cb1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd"
Dec 01 10:59:52 crc kubenswrapper[4909]: I1201 10:59:52.474640 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd"
Dec 01 10:59:53 crc kubenswrapper[4909]: I1201 10:59:53.001444 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd"]
Dec 01 10:59:53 crc kubenswrapper[4909]: I1201 10:59:53.093561 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd" event={"ID":"6405a27a-4265-4699-af98-e6dd88fb1cb1","Type":"ContainerStarted","Data":"23981f8ff26c82d7d096471b130937dbfd2cfbe1d3d93f7fc0744985e70b6890"}
Dec 01 10:59:54 crc kubenswrapper[4909]: I1201 10:59:54.130853 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd" event={"ID":"6405a27a-4265-4699-af98-e6dd88fb1cb1","Type":"ContainerStarted","Data":"bc059a122b879bbbdfbacd723b14a28978f5ed4be95062204943e238a5282049"}
Dec 01 10:59:54 crc kubenswrapper[4909]: I1201 10:59:54.157241 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd" podStartSLOduration=1.6750462590000001 podStartE2EDuration="2.157221364s" podCreationTimestamp="2025-12-01 10:59:52 +0000 UTC" firstStartedPulling="2025-12-01 10:59:53.005603179 +0000 UTC m=+1710.240074067" lastFinishedPulling="2025-12-01 10:59:53.487778274 +0000 UTC m=+1710.722249172" observedRunningTime="2025-12-01 10:59:54.150367077 +0000 UTC m=+1711.384837985" watchObservedRunningTime="2025-12-01 10:59:54.157221364 +0000 UTC m=+1711.391692272"
Dec 01 11:00:00 crc kubenswrapper[4909]: I1201 11:00:00.152764 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409780-9cj8w"]
Dec 01 11:00:00 crc kubenswrapper[4909]: I1201 11:00:00.156366 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-9cj8w"
Dec 01 11:00:00 crc kubenswrapper[4909]: I1201 11:00:00.159397 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 01 11:00:00 crc kubenswrapper[4909]: I1201 11:00:00.159685 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 01 11:00:00 crc kubenswrapper[4909]: I1201 11:00:00.181626 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409780-9cj8w"]
Dec 01 11:00:00 crc kubenswrapper[4909]: I1201 11:00:00.221664 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d3bf408-451e-4396-91f6-6340297bacf9-config-volume\") pod \"collect-profiles-29409780-9cj8w\" (UID: \"5d3bf408-451e-4396-91f6-6340297bacf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-9cj8w"
Dec 01 11:00:00 crc kubenswrapper[4909]: I1201 11:00:00.221771 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d3bf408-451e-4396-91f6-6340297bacf9-secret-volume\") pod \"collect-profiles-29409780-9cj8w\" (UID: \"5d3bf408-451e-4396-91f6-6340297bacf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-9cj8w"
Dec 01 11:00:00 crc kubenswrapper[4909]: I1201 11:00:00.221994 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n9wk\" (UniqueName: \"kubernetes.io/projected/5d3bf408-451e-4396-91f6-6340297bacf9-kube-api-access-5n9wk\") pod \"collect-profiles-29409780-9cj8w\" (UID: \"5d3bf408-451e-4396-91f6-6340297bacf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-9cj8w"
Dec 01 11:00:00 crc kubenswrapper[4909]: I1201 11:00:00.324530 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d3bf408-451e-4396-91f6-6340297bacf9-config-volume\") pod \"collect-profiles-29409780-9cj8w\" (UID: \"5d3bf408-451e-4396-91f6-6340297bacf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-9cj8w"
Dec 01 11:00:00 crc kubenswrapper[4909]: I1201 11:00:00.324604 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d3bf408-451e-4396-91f6-6340297bacf9-secret-volume\") pod \"collect-profiles-29409780-9cj8w\" (UID: \"5d3bf408-451e-4396-91f6-6340297bacf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-9cj8w"
Dec 01 11:00:00 crc kubenswrapper[4909]: I1201 11:00:00.324654 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n9wk\" (UniqueName: \"kubernetes.io/projected/5d3bf408-451e-4396-91f6-6340297bacf9-kube-api-access-5n9wk\") pod \"collect-profiles-29409780-9cj8w\" (UID: \"5d3bf408-451e-4396-91f6-6340297bacf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-9cj8w"
Dec 01 11:00:00 crc kubenswrapper[4909]: I1201 11:00:00.326054 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d3bf408-451e-4396-91f6-6340297bacf9-config-volume\") pod \"collect-profiles-29409780-9cj8w\" (UID: \"5d3bf408-451e-4396-91f6-6340297bacf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-9cj8w"
Dec 01 11:00:00 crc kubenswrapper[4909]: I1201 11:00:00.333518 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d3bf408-451e-4396-91f6-6340297bacf9-secret-volume\") pod \"collect-profiles-29409780-9cj8w\" (UID: \"5d3bf408-451e-4396-91f6-6340297bacf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-9cj8w"
Dec 01 11:00:00 crc kubenswrapper[4909]: I1201 11:00:00.346543 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n9wk\" (UniqueName: \"kubernetes.io/projected/5d3bf408-451e-4396-91f6-6340297bacf9-kube-api-access-5n9wk\") pod \"collect-profiles-29409780-9cj8w\" (UID: \"5d3bf408-451e-4396-91f6-6340297bacf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-9cj8w"
Dec 01 11:00:00 crc kubenswrapper[4909]: I1201 11:00:00.486580 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-9cj8w"
Dec 01 11:00:01 crc kubenswrapper[4909]: I1201 11:00:01.596707 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409780-9cj8w"]
Dec 01 11:00:02 crc kubenswrapper[4909]: I1201 11:00:02.230292 4909 generic.go:334] "Generic (PLEG): container finished" podID="5d3bf408-451e-4396-91f6-6340297bacf9" containerID="d734f22d7d0be1b25d612f481cffce815cae07d59dcb369a78bffd7e78d285fb" exitCode=0
Dec 01 11:00:02 crc kubenswrapper[4909]: I1201 11:00:02.230445 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-9cj8w" event={"ID":"5d3bf408-451e-4396-91f6-6340297bacf9","Type":"ContainerDied","Data":"d734f22d7d0be1b25d612f481cffce815cae07d59dcb369a78bffd7e78d285fb"}
Dec 01 11:00:02 crc kubenswrapper[4909]: I1201 11:00:02.230617 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-9cj8w" event={"ID":"5d3bf408-451e-4396-91f6-6340297bacf9","Type":"ContainerStarted","Data":"0b68d2bef0822deeb3c3f34ccf8d06fbe6a360aa3ff6dc028e4b52bc191caa87"}
Dec 01 11:00:03 crc kubenswrapper[4909]: I1201 11:00:03.245084 4909 generic.go:334] "Generic (PLEG): container finished" podID="6405a27a-4265-4699-af98-e6dd88fb1cb1" containerID="bc059a122b879bbbdfbacd723b14a28978f5ed4be95062204943e238a5282049" exitCode=0
Dec 01 11:00:03 crc kubenswrapper[4909]: I1201 11:00:03.245264 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd" event={"ID":"6405a27a-4265-4699-af98-e6dd88fb1cb1","Type":"ContainerDied","Data":"bc059a122b879bbbdfbacd723b14a28978f5ed4be95062204943e238a5282049"}
Dec 01 11:00:03 crc kubenswrapper[4909]: I1201 11:00:03.582082 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-9cj8w"
Dec 01 11:00:03 crc kubenswrapper[4909]: I1201 11:00:03.699296 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n9wk\" (UniqueName: \"kubernetes.io/projected/5d3bf408-451e-4396-91f6-6340297bacf9-kube-api-access-5n9wk\") pod \"5d3bf408-451e-4396-91f6-6340297bacf9\" (UID: \"5d3bf408-451e-4396-91f6-6340297bacf9\") "
Dec 01 11:00:03 crc kubenswrapper[4909]: I1201 11:00:03.699368 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d3bf408-451e-4396-91f6-6340297bacf9-config-volume\") pod \"5d3bf408-451e-4396-91f6-6340297bacf9\" (UID: \"5d3bf408-451e-4396-91f6-6340297bacf9\") "
Dec 01 11:00:03 crc kubenswrapper[4909]: I1201 11:00:03.699574 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d3bf408-451e-4396-91f6-6340297bacf9-secret-volume\") pod \"5d3bf408-451e-4396-91f6-6340297bacf9\" (UID: \"5d3bf408-451e-4396-91f6-6340297bacf9\") "
Dec 01 11:00:03 crc kubenswrapper[4909]: I1201 11:00:03.700673 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d3bf408-451e-4396-91f6-6340297bacf9-config-volume" (OuterVolumeSpecName: "config-volume") pod "5d3bf408-451e-4396-91f6-6340297bacf9" (UID: "5d3bf408-451e-4396-91f6-6340297bacf9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 11:00:03 crc kubenswrapper[4909]: I1201 11:00:03.705781 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3bf408-451e-4396-91f6-6340297bacf9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5d3bf408-451e-4396-91f6-6340297bacf9" (UID: "5d3bf408-451e-4396-91f6-6340297bacf9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:00:03 crc kubenswrapper[4909]: I1201 11:00:03.707238 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d3bf408-451e-4396-91f6-6340297bacf9-kube-api-access-5n9wk" (OuterVolumeSpecName: "kube-api-access-5n9wk") pod "5d3bf408-451e-4396-91f6-6340297bacf9" (UID: "5d3bf408-451e-4396-91f6-6340297bacf9"). InnerVolumeSpecName "kube-api-access-5n9wk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:00:03 crc kubenswrapper[4909]: I1201 11:00:03.801589 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n9wk\" (UniqueName: \"kubernetes.io/projected/5d3bf408-451e-4396-91f6-6340297bacf9-kube-api-access-5n9wk\") on node \"crc\" DevicePath \"\""
Dec 01 11:00:03 crc kubenswrapper[4909]: I1201 11:00:03.801936 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d3bf408-451e-4396-91f6-6340297bacf9-config-volume\") on node \"crc\" DevicePath \"\""
Dec 01 11:00:03 crc kubenswrapper[4909]: I1201 11:00:03.801952 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5d3bf408-451e-4396-91f6-6340297bacf9-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 01 11:00:04 crc kubenswrapper[4909]: I1201 11:00:04.256385 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-9cj8w" event={"ID":"5d3bf408-451e-4396-91f6-6340297bacf9","Type":"ContainerDied","Data":"0b68d2bef0822deeb3c3f34ccf8d06fbe6a360aa3ff6dc028e4b52bc191caa87"}
Dec 01 11:00:04 crc kubenswrapper[4909]: I1201 11:00:04.256461 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b68d2bef0822deeb3c3f34ccf8d06fbe6a360aa3ff6dc028e4b52bc191caa87"
Dec 01 11:00:04 crc kubenswrapper[4909]: I1201 11:00:04.256565 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-9cj8w"
Dec 01 11:00:04 crc kubenswrapper[4909]: I1201 11:00:04.783257 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd"
Dec 01 11:00:04 crc kubenswrapper[4909]: I1201 11:00:04.925280 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6405a27a-4265-4699-af98-e6dd88fb1cb1-inventory\") pod \"6405a27a-4265-4699-af98-e6dd88fb1cb1\" (UID: \"6405a27a-4265-4699-af98-e6dd88fb1cb1\") "
Dec 01 11:00:04 crc kubenswrapper[4909]: I1201 11:00:04.925590 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcq6w\" (UniqueName: \"kubernetes.io/projected/6405a27a-4265-4699-af98-e6dd88fb1cb1-kube-api-access-kcq6w\") pod \"6405a27a-4265-4699-af98-e6dd88fb1cb1\" (UID: \"6405a27a-4265-4699-af98-e6dd88fb1cb1\") "
Dec 01 11:00:04 crc kubenswrapper[4909]: I1201 11:00:04.925721 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6405a27a-4265-4699-af98-e6dd88fb1cb1-ssh-key\") pod \"6405a27a-4265-4699-af98-e6dd88fb1cb1\" (UID: \"6405a27a-4265-4699-af98-e6dd88fb1cb1\") "
Dec 01 11:00:04 crc kubenswrapper[4909]: I1201 11:00:04.936436 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6405a27a-4265-4699-af98-e6dd88fb1cb1-kube-api-access-kcq6w" (OuterVolumeSpecName: "kube-api-access-kcq6w") pod "6405a27a-4265-4699-af98-e6dd88fb1cb1" (UID: "6405a27a-4265-4699-af98-e6dd88fb1cb1"). InnerVolumeSpecName "kube-api-access-kcq6w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:00:04 crc kubenswrapper[4909]: I1201 11:00:04.958428 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6405a27a-4265-4699-af98-e6dd88fb1cb1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6405a27a-4265-4699-af98-e6dd88fb1cb1" (UID: "6405a27a-4265-4699-af98-e6dd88fb1cb1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:00:04 crc kubenswrapper[4909]: I1201 11:00:04.971194 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6405a27a-4265-4699-af98-e6dd88fb1cb1-inventory" (OuterVolumeSpecName: "inventory") pod "6405a27a-4265-4699-af98-e6dd88fb1cb1" (UID: "6405a27a-4265-4699-af98-e6dd88fb1cb1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:00:05 crc kubenswrapper[4909]: I1201 11:00:05.028837 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcq6w\" (UniqueName: \"kubernetes.io/projected/6405a27a-4265-4699-af98-e6dd88fb1cb1-kube-api-access-kcq6w\") on node \"crc\" DevicePath \"\""
Dec 01 11:00:05 crc kubenswrapper[4909]: I1201 11:00:05.028913 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6405a27a-4265-4699-af98-e6dd88fb1cb1-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 01 11:00:05 crc kubenswrapper[4909]: I1201 11:00:05.028930 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6405a27a-4265-4699-af98-e6dd88fb1cb1-inventory\") on node \"crc\" DevicePath \"\""
Dec 01 11:00:05 crc kubenswrapper[4909]: I1201 11:00:05.259995 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb"
Dec 01 11:00:05 crc kubenswrapper[4909]: E1201 11:00:05.262647 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be"
Dec 01 11:00:05 crc kubenswrapper[4909]: I1201 11:00:05.267966 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd"
Dec 01 11:00:05 crc kubenswrapper[4909]: I1201 11:00:05.269900 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd" event={"ID":"6405a27a-4265-4699-af98-e6dd88fb1cb1","Type":"ContainerDied","Data":"23981f8ff26c82d7d096471b130937dbfd2cfbe1d3d93f7fc0744985e70b6890"}
Dec 01 11:00:05 crc kubenswrapper[4909]: I1201 11:00:05.269948 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23981f8ff26c82d7d096471b130937dbfd2cfbe1d3d93f7fc0744985e70b6890"
Dec 01 11:00:16 crc kubenswrapper[4909]: I1201 11:00:16.043350 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s699j"]
Dec 01 11:00:16 crc kubenswrapper[4909]: I1201 11:00:16.052897 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s699j"]
Dec 01 11:00:16 crc kubenswrapper[4909]: I1201 11:00:16.257326 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb"
Dec 01 11:00:16 crc kubenswrapper[4909]: E1201 11:00:16.257817 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be"
Dec 01 11:00:17 crc kubenswrapper[4909]: I1201 11:00:17.267382 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88326ebb-a888-4633-9459-114d0e1f2cc9" path="/var/lib/kubelet/pods/88326ebb-a888-4633-9459-114d0e1f2cc9/volumes"
Dec 01 11:00:30 crc kubenswrapper[4909]: I1201 11:00:30.257888 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb"
Dec 01 11:00:30 crc kubenswrapper[4909]: E1201 11:00:30.259121 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be"
Dec 01 11:00:40 crc kubenswrapper[4909]: I1201 11:00:40.039252 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-9whwl"]
Dec 01 11:00:40 crc kubenswrapper[4909]: I1201 11:00:40.050504 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bqghh"]
Dec 01 11:00:40 crc kubenswrapper[4909]: I1201 11:00:40.058609 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bqghh"]
Dec 01 11:00:40 crc kubenswrapper[4909]: I1201 11:00:40.067238 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-9whwl"]
Dec 01 11:00:41 crc kubenswrapper[4909]: I1201 11:00:41.233013 4909 scope.go:117] "RemoveContainer" containerID="fd8f0f37f38583ab1dd43125ff22588bf9286608d390787c80430153ee02f484"
Dec 01 11:00:41 crc kubenswrapper[4909]: I1201 11:00:41.256637 4909 scope.go:117] "RemoveContainer" containerID="9d36db06fa89f96e9752917fb917370a1f9fa4d5705c48b69a01a28e1838ff13"
Dec 01 11:00:41 crc kubenswrapper[4909]: I1201 11:00:41.268160 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4556e9cd-36ab-411b-9065-0b74a4c426a5" path="/var/lib/kubelet/pods/4556e9cd-36ab-411b-9065-0b74a4c426a5/volumes"
Dec 01 11:00:41 crc kubenswrapper[4909]: I1201 11:00:41.268821 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ddd315c-8e96-40fd-ba04-48bf855ac533" path="/var/lib/kubelet/pods/8ddd315c-8e96-40fd-ba04-48bf855ac533/volumes"
Dec 01 11:00:41 crc kubenswrapper[4909]: I1201 11:00:41.299623 4909 scope.go:117] "RemoveContainer" containerID="93ed8cc7aeae497a62c813e5c64ec92413b7297909aadf7dde1d6e0ab7250fbe"
Dec 01 11:00:41 crc kubenswrapper[4909]: I1201 11:00:41.343805 4909 scope.go:117] "RemoveContainer" containerID="1b1c081ad011b830f52069e6d7413ec089c27f035353d85d6fab421846a252ee"
Dec 01 11:00:41 crc kubenswrapper[4909]: I1201 11:00:41.396425 4909 scope.go:117] "RemoveContainer" containerID="5a2c50bac4437dcfa63d5d55e1fc7633aee69a8e30b09a902cbda218e65ee808"
Dec 01 11:00:41 crc kubenswrapper[4909]: I1201 11:00:41.432957 4909 scope.go:117] "RemoveContainer" containerID="5e5a91b7b7a48182688c61e0c7011c5391ffcd15791ae48afb0a4546fba6616f"
Dec 01 11:00:41 crc kubenswrapper[4909]: I1201 11:00:41.498787 4909 scope.go:117] "RemoveContainer" containerID="fa83846e3bbdcf2223b2a1232cf6495e7cb0f22e17bac29575877582115d6dfd"
Dec 01 11:00:41 crc kubenswrapper[4909]: I1201 11:00:41.526046 4909 scope.go:117] "RemoveContainer" containerID="3ada28d3082887f0c23f4cabfbfbb741cea5332095b300be69dcd94609ddd8fc"
Dec 01 11:00:43 crc kubenswrapper[4909]: I1201 11:00:43.262666 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb"
Dec 01 11:00:43 crc kubenswrapper[4909]: E1201 11:00:43.263290 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be"
Dec 01 11:00:54 crc kubenswrapper[4909]: I1201 11:00:54.256789 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb"
Dec 01 11:00:54 crc kubenswrapper[4909]: E1201 11:00:54.257435 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be"
Dec 01 11:01:00 crc kubenswrapper[4909]: I1201 11:01:00.156713 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29409781-m7z6j"]
Dec 01 11:01:00 crc kubenswrapper[4909]: E1201 11:01:00.157896 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6405a27a-4265-4699-af98-e6dd88fb1cb1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 11:01:00 crc kubenswrapper[4909]: I1201 11:01:00.157918 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6405a27a-4265-4699-af98-e6dd88fb1cb1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 11:01:00 crc kubenswrapper[4909]: E1201 11:01:00.157966 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d3bf408-451e-4396-91f6-6340297bacf9" containerName="collect-profiles"
Dec 01 11:01:00 crc kubenswrapper[4909]: I1201 11:01:00.157975 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3bf408-451e-4396-91f6-6340297bacf9" containerName="collect-profiles"
Dec 01 11:01:00 crc kubenswrapper[4909]: I1201 11:01:00.158176 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6405a27a-4265-4699-af98-e6dd88fb1cb1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 11:01:00 crc kubenswrapper[4909]: I1201 11:01:00.158194 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d3bf408-451e-4396-91f6-6340297bacf9" containerName="collect-profiles"
Dec 01 11:01:00 crc kubenswrapper[4909]: I1201 11:01:00.159214 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29409781-m7z6j"
Dec 01 11:01:00 crc kubenswrapper[4909]: I1201 11:01:00.166526 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29409781-m7z6j"]
Dec 01 11:01:00 crc kubenswrapper[4909]: I1201 11:01:00.211932 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/264e1bcf-af02-4231-9f5e-84f4bae0db08-config-data\") pod \"keystone-cron-29409781-m7z6j\" (UID: \"264e1bcf-af02-4231-9f5e-84f4bae0db08\") " pod="openstack/keystone-cron-29409781-m7z6j"
Dec 01 11:01:00 crc kubenswrapper[4909]: I1201 11:01:00.212023 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gzlq\" (UniqueName: \"kubernetes.io/projected/264e1bcf-af02-4231-9f5e-84f4bae0db08-kube-api-access-4gzlq\") pod \"keystone-cron-29409781-m7z6j\" (UID: \"264e1bcf-af02-4231-9f5e-84f4bae0db08\") " pod="openstack/keystone-cron-29409781-m7z6j"
Dec 01 11:01:00 crc kubenswrapper[4909]: I1201 11:01:00.212079 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264e1bcf-af02-4231-9f5e-84f4bae0db08-combined-ca-bundle\") pod \"keystone-cron-29409781-m7z6j\" (UID: \"264e1bcf-af02-4231-9f5e-84f4bae0db08\") " pod="openstack/keystone-cron-29409781-m7z6j"
Dec 01 11:01:00 crc kubenswrapper[4909]: I1201 11:01:00.212130 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/264e1bcf-af02-4231-9f5e-84f4bae0db08-fernet-keys\") pod \"keystone-cron-29409781-m7z6j\" (UID: \"264e1bcf-af02-4231-9f5e-84f4bae0db08\") " pod="openstack/keystone-cron-29409781-m7z6j"
Dec 01 11:01:00 crc kubenswrapper[4909]: I1201 11:01:00.313770 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264e1bcf-af02-4231-9f5e-84f4bae0db08-combined-ca-bundle\") pod \"keystone-cron-29409781-m7z6j\" (UID: \"264e1bcf-af02-4231-9f5e-84f4bae0db08\") " pod="openstack/keystone-cron-29409781-m7z6j"
Dec 01 11:01:00 crc kubenswrapper[4909]: I1201 11:01:00.313918 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/264e1bcf-af02-4231-9f5e-84f4bae0db08-fernet-keys\") pod \"keystone-cron-29409781-m7z6j\" (UID: \"264e1bcf-af02-4231-9f5e-84f4bae0db08\") " pod="openstack/keystone-cron-29409781-m7z6j"
Dec 01 11:01:00 crc kubenswrapper[4909]: I1201 11:01:00.314033 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/264e1bcf-af02-4231-9f5e-84f4bae0db08-config-data\") pod \"keystone-cron-29409781-m7z6j\" (UID: \"264e1bcf-af02-4231-9f5e-84f4bae0db08\") " pod="openstack/keystone-cron-29409781-m7z6j"
Dec 01 11:01:00 crc kubenswrapper[4909]: I1201 11:01:00.314095 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gzlq\" (UniqueName: \"kubernetes.io/projected/264e1bcf-af02-4231-9f5e-84f4bae0db08-kube-api-access-4gzlq\") pod \"keystone-cron-29409781-m7z6j\" (UID: \"264e1bcf-af02-4231-9f5e-84f4bae0db08\") " pod="openstack/keystone-cron-29409781-m7z6j"
Dec 01 11:01:00 crc kubenswrapper[4909]: I1201 11:01:00.320334 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/264e1bcf-af02-4231-9f5e-84f4bae0db08-fernet-keys\") pod \"keystone-cron-29409781-m7z6j\" (UID: \"264e1bcf-af02-4231-9f5e-84f4bae0db08\") " pod="openstack/keystone-cron-29409781-m7z6j"
Dec 01 11:01:00 crc kubenswrapper[4909]: I1201 11:01:00.321548 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/264e1bcf-af02-4231-9f5e-84f4bae0db08-config-data\") pod \"keystone-cron-29409781-m7z6j\" (UID: \"264e1bcf-af02-4231-9f5e-84f4bae0db08\") " pod="openstack/keystone-cron-29409781-m7z6j"
Dec 01 11:01:00 crc kubenswrapper[4909]: I1201 11:01:00.321568 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264e1bcf-af02-4231-9f5e-84f4bae0db08-combined-ca-bundle\") pod \"keystone-cron-29409781-m7z6j\" (UID: \"264e1bcf-af02-4231-9f5e-84f4bae0db08\") " pod="openstack/keystone-cron-29409781-m7z6j"
Dec 01 11:01:00 crc kubenswrapper[4909]: I1201 11:01:00.331650 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gzlq\" (UniqueName: \"kubernetes.io/projected/264e1bcf-af02-4231-9f5e-84f4bae0db08-kube-api-access-4gzlq\") pod \"keystone-cron-29409781-m7z6j\" (UID: \"264e1bcf-af02-4231-9f5e-84f4bae0db08\") " pod="openstack/keystone-cron-29409781-m7z6j"
Dec 01 11:01:00 crc kubenswrapper[4909]: I1201 11:01:00.482631 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29409781-m7z6j" Dec 01 11:01:00 crc kubenswrapper[4909]: I1201 11:01:00.938759 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29409781-m7z6j"] Dec 01 11:01:01 crc kubenswrapper[4909]: I1201 11:01:01.843111 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409781-m7z6j" event={"ID":"264e1bcf-af02-4231-9f5e-84f4bae0db08","Type":"ContainerStarted","Data":"0fad6994df1d09d42283a955df661e942500871b4ea2ffc4cffb96e9ad7d5e88"} Dec 01 11:01:01 crc kubenswrapper[4909]: I1201 11:01:01.843555 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409781-m7z6j" event={"ID":"264e1bcf-af02-4231-9f5e-84f4bae0db08","Type":"ContainerStarted","Data":"6f9f1fd384ef41f61761ebff333006487b660247d5dd5ce6874ef3b32adcf6e9"} Dec 01 11:01:01 crc kubenswrapper[4909]: I1201 11:01:01.869078 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29409781-m7z6j" podStartSLOduration=1.8690587779999999 podStartE2EDuration="1.869058778s" podCreationTimestamp="2025-12-01 11:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:01:01.860815927 +0000 UTC m=+1779.095286835" watchObservedRunningTime="2025-12-01 11:01:01.869058778 +0000 UTC m=+1779.103529676" Dec 01 11:01:03 crc kubenswrapper[4909]: I1201 11:01:03.859503 4909 generic.go:334] "Generic (PLEG): container finished" podID="264e1bcf-af02-4231-9f5e-84f4bae0db08" containerID="0fad6994df1d09d42283a955df661e942500871b4ea2ffc4cffb96e9ad7d5e88" exitCode=0 Dec 01 11:01:03 crc kubenswrapper[4909]: I1201 11:01:03.859594 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409781-m7z6j" 
event={"ID":"264e1bcf-af02-4231-9f5e-84f4bae0db08","Type":"ContainerDied","Data":"0fad6994df1d09d42283a955df661e942500871b4ea2ffc4cffb96e9ad7d5e88"} Dec 01 11:01:05 crc kubenswrapper[4909]: I1201 11:01:05.186321 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29409781-m7z6j" Dec 01 11:01:05 crc kubenswrapper[4909]: I1201 11:01:05.308408 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264e1bcf-af02-4231-9f5e-84f4bae0db08-combined-ca-bundle\") pod \"264e1bcf-af02-4231-9f5e-84f4bae0db08\" (UID: \"264e1bcf-af02-4231-9f5e-84f4bae0db08\") " Dec 01 11:01:05 crc kubenswrapper[4909]: I1201 11:01:05.308480 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/264e1bcf-af02-4231-9f5e-84f4bae0db08-config-data\") pod \"264e1bcf-af02-4231-9f5e-84f4bae0db08\" (UID: \"264e1bcf-af02-4231-9f5e-84f4bae0db08\") " Dec 01 11:01:05 crc kubenswrapper[4909]: I1201 11:01:05.308533 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/264e1bcf-af02-4231-9f5e-84f4bae0db08-fernet-keys\") pod \"264e1bcf-af02-4231-9f5e-84f4bae0db08\" (UID: \"264e1bcf-af02-4231-9f5e-84f4bae0db08\") " Dec 01 11:01:05 crc kubenswrapper[4909]: I1201 11:01:05.308760 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gzlq\" (UniqueName: \"kubernetes.io/projected/264e1bcf-af02-4231-9f5e-84f4bae0db08-kube-api-access-4gzlq\") pod \"264e1bcf-af02-4231-9f5e-84f4bae0db08\" (UID: \"264e1bcf-af02-4231-9f5e-84f4bae0db08\") " Dec 01 11:01:05 crc kubenswrapper[4909]: I1201 11:01:05.315308 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/264e1bcf-af02-4231-9f5e-84f4bae0db08-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "264e1bcf-af02-4231-9f5e-84f4bae0db08" (UID: "264e1bcf-af02-4231-9f5e-84f4bae0db08"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:01:05 crc kubenswrapper[4909]: I1201 11:01:05.329099 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/264e1bcf-af02-4231-9f5e-84f4bae0db08-kube-api-access-4gzlq" (OuterVolumeSpecName: "kube-api-access-4gzlq") pod "264e1bcf-af02-4231-9f5e-84f4bae0db08" (UID: "264e1bcf-af02-4231-9f5e-84f4bae0db08"). InnerVolumeSpecName "kube-api-access-4gzlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:01:05 crc kubenswrapper[4909]: I1201 11:01:05.339623 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/264e1bcf-af02-4231-9f5e-84f4bae0db08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "264e1bcf-af02-4231-9f5e-84f4bae0db08" (UID: "264e1bcf-af02-4231-9f5e-84f4bae0db08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:01:05 crc kubenswrapper[4909]: I1201 11:01:05.364939 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/264e1bcf-af02-4231-9f5e-84f4bae0db08-config-data" (OuterVolumeSpecName: "config-data") pod "264e1bcf-af02-4231-9f5e-84f4bae0db08" (UID: "264e1bcf-af02-4231-9f5e-84f4bae0db08"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:01:05 crc kubenswrapper[4909]: I1201 11:01:05.412475 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gzlq\" (UniqueName: \"kubernetes.io/projected/264e1bcf-af02-4231-9f5e-84f4bae0db08-kube-api-access-4gzlq\") on node \"crc\" DevicePath \"\"" Dec 01 11:01:05 crc kubenswrapper[4909]: I1201 11:01:05.412519 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264e1bcf-af02-4231-9f5e-84f4bae0db08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:01:05 crc kubenswrapper[4909]: I1201 11:01:05.412530 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/264e1bcf-af02-4231-9f5e-84f4bae0db08-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:01:05 crc kubenswrapper[4909]: I1201 11:01:05.412541 4909 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/264e1bcf-af02-4231-9f5e-84f4bae0db08-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 11:01:05 crc kubenswrapper[4909]: I1201 11:01:05.879315 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409781-m7z6j" event={"ID":"264e1bcf-af02-4231-9f5e-84f4bae0db08","Type":"ContainerDied","Data":"6f9f1fd384ef41f61761ebff333006487b660247d5dd5ce6874ef3b32adcf6e9"} Dec 01 11:01:05 crc kubenswrapper[4909]: I1201 11:01:05.879364 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f9f1fd384ef41f61761ebff333006487b660247d5dd5ce6874ef3b32adcf6e9" Dec 01 11:01:05 crc kubenswrapper[4909]: I1201 11:01:05.879391 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29409781-m7z6j" Dec 01 11:01:09 crc kubenswrapper[4909]: I1201 11:01:09.258472 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" Dec 01 11:01:09 crc kubenswrapper[4909]: E1201 11:01:09.261328 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:01:21 crc kubenswrapper[4909]: I1201 11:01:21.257391 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" Dec 01 11:01:21 crc kubenswrapper[4909]: E1201 11:01:21.259281 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:01:25 crc kubenswrapper[4909]: I1201 11:01:25.046997 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-q8n7t"] Dec 01 11:01:25 crc kubenswrapper[4909]: I1201 11:01:25.056396 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-q8n7t"] Dec 01 11:01:25 crc kubenswrapper[4909]: I1201 11:01:25.267168 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99316486-3c01-47ab-8924-cee49da4e1b4" path="/var/lib/kubelet/pods/99316486-3c01-47ab-8924-cee49da4e1b4/volumes" Dec 
01 11:01:35 crc kubenswrapper[4909]: I1201 11:01:35.257303 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" Dec 01 11:01:35 crc kubenswrapper[4909]: E1201 11:01:35.258041 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:01:41 crc kubenswrapper[4909]: I1201 11:01:41.705988 4909 scope.go:117] "RemoveContainer" containerID="cf93c045bda5c27f82b5e9a4fa58e99c37c3561a467b54813269308f17bfdf9c" Dec 01 11:01:41 crc kubenswrapper[4909]: I1201 11:01:41.758416 4909 scope.go:117] "RemoveContainer" containerID="bd9dd739f1153b614240161421f81051fcb6fb70d83f3b38f8553f249952f701" Dec 01 11:01:50 crc kubenswrapper[4909]: I1201 11:01:50.257191 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" Dec 01 11:01:51 crc kubenswrapper[4909]: I1201 11:01:51.267315 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"f41d4464aa3fd8f418a1acd106f20e41fd1625eaf7c86916aa685bf86d68ce5c"} Dec 01 11:03:14 crc kubenswrapper[4909]: I1201 11:03:14.485234 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w"] Dec 01 11:03:14 crc kubenswrapper[4909]: I1201 11:03:14.495285 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k"] Dec 01 11:03:14 crc kubenswrapper[4909]: I1201 11:03:14.505956 4909 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8vx8w"] Dec 01 11:03:14 crc kubenswrapper[4909]: I1201 11:03:14.513940 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zkr4k"] Dec 01 11:03:14 crc kubenswrapper[4909]: I1201 11:03:14.521175 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2"] Dec 01 11:03:14 crc kubenswrapper[4909]: I1201 11:03:14.528297 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hv2g2"] Dec 01 11:03:14 crc kubenswrapper[4909]: I1201 11:03:14.535110 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6"] Dec 01 11:03:14 crc kubenswrapper[4909]: I1201 11:03:14.542259 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ltpq6"] Dec 01 11:03:14 crc kubenswrapper[4909]: I1201 11:03:14.549201 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2s8gf"] Dec 01 11:03:14 crc kubenswrapper[4909]: I1201 11:03:14.556152 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd"] Dec 01 11:03:14 crc kubenswrapper[4909]: I1201 11:03:14.563050 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2s8gf"] Dec 01 11:03:14 crc kubenswrapper[4909]: I1201 11:03:14.570144 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2xqkd"] Dec 01 11:03:14 crc kubenswrapper[4909]: I1201 11:03:14.577958 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d"] Dec 01 
11:03:14 crc kubenswrapper[4909]: I1201 11:03:14.585123 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn"] Dec 01 11:03:14 crc kubenswrapper[4909]: I1201 11:03:14.592138 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh"] Dec 01 11:03:14 crc kubenswrapper[4909]: I1201 11:03:14.600561 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd"] Dec 01 11:03:14 crc kubenswrapper[4909]: I1201 11:03:14.606982 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ddnzn"] Dec 01 11:03:14 crc kubenswrapper[4909]: I1201 11:03:14.613350 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dqq4d"] Dec 01 11:03:14 crc kubenswrapper[4909]: I1201 11:03:14.619404 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hh9bd"] Dec 01 11:03:14 crc kubenswrapper[4909]: I1201 11:03:14.625866 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5lfnh"] Dec 01 11:03:15 crc kubenswrapper[4909]: I1201 11:03:15.266301 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07c812b5-85e1-4e57-bbf8-139a0a4e71d0" path="/var/lib/kubelet/pods/07c812b5-85e1-4e57-bbf8-139a0a4e71d0/volumes" Dec 01 11:03:15 crc kubenswrapper[4909]: I1201 11:03:15.266894 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2305ea3d-8ee9-43ec-967b-c19b3088c24e" path="/var/lib/kubelet/pods/2305ea3d-8ee9-43ec-967b-c19b3088c24e/volumes" Dec 01 11:03:15 crc kubenswrapper[4909]: I1201 11:03:15.267437 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e095497-a8dc-43e9-b64b-68382cd859be" 
path="/var/lib/kubelet/pods/2e095497-a8dc-43e9-b64b-68382cd859be/volumes" Dec 01 11:03:15 crc kubenswrapper[4909]: I1201 11:03:15.268112 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53ed2809-6df5-479f-a447-57fec5cb16ca" path="/var/lib/kubelet/pods/53ed2809-6df5-479f-a447-57fec5cb16ca/volumes" Dec 01 11:03:15 crc kubenswrapper[4909]: I1201 11:03:15.269738 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60d8cc9d-aaa3-4244-9cd2-c384293f0328" path="/var/lib/kubelet/pods/60d8cc9d-aaa3-4244-9cd2-c384293f0328/volumes" Dec 01 11:03:15 crc kubenswrapper[4909]: I1201 11:03:15.270350 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6405a27a-4265-4699-af98-e6dd88fb1cb1" path="/var/lib/kubelet/pods/6405a27a-4265-4699-af98-e6dd88fb1cb1/volumes" Dec 01 11:03:15 crc kubenswrapper[4909]: I1201 11:03:15.270959 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c539a3fa-4b2a-4a11-91ab-9996e6c0c99d" path="/var/lib/kubelet/pods/c539a3fa-4b2a-4a11-91ab-9996e6c0c99d/volumes" Dec 01 11:03:15 crc kubenswrapper[4909]: I1201 11:03:15.271519 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbc5c9d4-b897-448f-b8eb-6e16dc970df9" path="/var/lib/kubelet/pods/cbc5c9d4-b897-448f-b8eb-6e16dc970df9/volumes" Dec 01 11:03:15 crc kubenswrapper[4909]: I1201 11:03:15.272235 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eac9f2ce-c178-44e9-918e-cf0eacbcc7b1" path="/var/lib/kubelet/pods/eac9f2ce-c178-44e9-918e-cf0eacbcc7b1/volumes" Dec 01 11:03:15 crc kubenswrapper[4909]: I1201 11:03:15.272903 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fce05ed6-3691-4184-9a55-9dfac30486cf" path="/var/lib/kubelet/pods/fce05ed6-3691-4184-9a55-9dfac30486cf/volumes" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.399965 4909 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s"] Dec 01 11:03:20 crc kubenswrapper[4909]: E1201 11:03:20.400726 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="264e1bcf-af02-4231-9f5e-84f4bae0db08" containerName="keystone-cron" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.400744 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="264e1bcf-af02-4231-9f5e-84f4bae0db08" containerName="keystone-cron" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.400945 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="264e1bcf-af02-4231-9f5e-84f4bae0db08" containerName="keystone-cron" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.402598 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.423178 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.423368 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.423560 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.423703 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.424166 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.440194 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s"] Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 
11:03:20.527253 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk7rm\" (UniqueName: \"kubernetes.io/projected/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-kube-api-access-nk7rm\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s954s\" (UID: \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.527377 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s954s\" (UID: \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.527422 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s954s\" (UID: \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.527447 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s954s\" (UID: \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.527493 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s954s\" (UID: \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.628810 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s954s\" (UID: \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.628899 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk7rm\" (UniqueName: \"kubernetes.io/projected/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-kube-api-access-nk7rm\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s954s\" (UID: \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.628995 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s954s\" (UID: \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.629054 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s954s\" (UID: \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" Dec 01 11:03:20 crc kubenswrapper[4909]: 
I1201 11:03:20.629094 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s954s\" (UID: \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.635174 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s954s\" (UID: \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.635205 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s954s\" (UID: \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.635591 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s954s\" (UID: \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.639816 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s954s\" (UID: 
\"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.650010 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk7rm\" (UniqueName: \"kubernetes.io/projected/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-kube-api-access-nk7rm\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s954s\" (UID: \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" Dec 01 11:03:20 crc kubenswrapper[4909]: I1201 11:03:20.753620 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" Dec 01 11:03:21 crc kubenswrapper[4909]: I1201 11:03:21.246554 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s"] Dec 01 11:03:21 crc kubenswrapper[4909]: I1201 11:03:21.250830 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 11:03:22 crc kubenswrapper[4909]: I1201 11:03:22.051484 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" event={"ID":"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5","Type":"ContainerStarted","Data":"f759f8ebc583513c4b7151b83d3455865eb8657909532e9a0a1d6ca6f9310884"} Dec 01 11:03:23 crc kubenswrapper[4909]: I1201 11:03:23.064860 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" event={"ID":"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5","Type":"ContainerStarted","Data":"0447fc5a702142e9df53f304139d49174c3b085f4090d400a035a16baac186cb"} Dec 01 11:03:23 crc kubenswrapper[4909]: I1201 11:03:23.090271 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" 
podStartSLOduration=1.709460826 podStartE2EDuration="3.090248646s" podCreationTimestamp="2025-12-01 11:03:20 +0000 UTC" firstStartedPulling="2025-12-01 11:03:21.250580621 +0000 UTC m=+1918.485051519" lastFinishedPulling="2025-12-01 11:03:22.631368441 +0000 UTC m=+1919.865839339" observedRunningTime="2025-12-01 11:03:23.078987059 +0000 UTC m=+1920.313457977" watchObservedRunningTime="2025-12-01 11:03:23.090248646 +0000 UTC m=+1920.324719554" Dec 01 11:03:34 crc kubenswrapper[4909]: I1201 11:03:34.185525 4909 generic.go:334] "Generic (PLEG): container finished" podID="b6b42ea5-6854-49d3-82bb-d7559fc8e9d5" containerID="0447fc5a702142e9df53f304139d49174c3b085f4090d400a035a16baac186cb" exitCode=0 Dec 01 11:03:34 crc kubenswrapper[4909]: I1201 11:03:34.187001 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" event={"ID":"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5","Type":"ContainerDied","Data":"0447fc5a702142e9df53f304139d49174c3b085f4090d400a035a16baac186cb"} Dec 01 11:03:35 crc kubenswrapper[4909]: I1201 11:03:35.603489 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" Dec 01 11:03:35 crc kubenswrapper[4909]: I1201 11:03:35.771704 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-inventory\") pod \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\" (UID: \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\") " Dec 01 11:03:35 crc kubenswrapper[4909]: I1201 11:03:35.772172 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-ssh-key\") pod \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\" (UID: \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\") " Dec 01 11:03:35 crc kubenswrapper[4909]: I1201 11:03:35.772232 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk7rm\" (UniqueName: \"kubernetes.io/projected/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-kube-api-access-nk7rm\") pod \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\" (UID: \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\") " Dec 01 11:03:35 crc kubenswrapper[4909]: I1201 11:03:35.772362 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-repo-setup-combined-ca-bundle\") pod \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\" (UID: \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\") " Dec 01 11:03:35 crc kubenswrapper[4909]: I1201 11:03:35.772441 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-ceph\") pod \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\" (UID: \"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5\") " Dec 01 11:03:35 crc kubenswrapper[4909]: I1201 11:03:35.779351 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-ceph" (OuterVolumeSpecName: "ceph") pod "b6b42ea5-6854-49d3-82bb-d7559fc8e9d5" (UID: "b6b42ea5-6854-49d3-82bb-d7559fc8e9d5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:03:35 crc kubenswrapper[4909]: I1201 11:03:35.781274 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-kube-api-access-nk7rm" (OuterVolumeSpecName: "kube-api-access-nk7rm") pod "b6b42ea5-6854-49d3-82bb-d7559fc8e9d5" (UID: "b6b42ea5-6854-49d3-82bb-d7559fc8e9d5"). InnerVolumeSpecName "kube-api-access-nk7rm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:03:35 crc kubenswrapper[4909]: I1201 11:03:35.783978 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "b6b42ea5-6854-49d3-82bb-d7559fc8e9d5" (UID: "b6b42ea5-6854-49d3-82bb-d7559fc8e9d5"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:03:35 crc kubenswrapper[4909]: I1201 11:03:35.804621 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-inventory" (OuterVolumeSpecName: "inventory") pod "b6b42ea5-6854-49d3-82bb-d7559fc8e9d5" (UID: "b6b42ea5-6854-49d3-82bb-d7559fc8e9d5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:03:35 crc kubenswrapper[4909]: I1201 11:03:35.809435 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b6b42ea5-6854-49d3-82bb-d7559fc8e9d5" (UID: "b6b42ea5-6854-49d3-82bb-d7559fc8e9d5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:03:35 crc kubenswrapper[4909]: I1201 11:03:35.874511 4909 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:03:35 crc kubenswrapper[4909]: I1201 11:03:35.874727 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 11:03:35 crc kubenswrapper[4909]: I1201 11:03:35.874744 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 11:03:35 crc kubenswrapper[4909]: I1201 11:03:35.874754 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 11:03:35 crc kubenswrapper[4909]: I1201 11:03:35.874762 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk7rm\" (UniqueName: \"kubernetes.io/projected/b6b42ea5-6854-49d3-82bb-d7559fc8e9d5-kube-api-access-nk7rm\") on node \"crc\" DevicePath \"\"" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.212732 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" event={"ID":"b6b42ea5-6854-49d3-82bb-d7559fc8e9d5","Type":"ContainerDied","Data":"f759f8ebc583513c4b7151b83d3455865eb8657909532e9a0a1d6ca6f9310884"} Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.212776 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f759f8ebc583513c4b7151b83d3455865eb8657909532e9a0a1d6ca6f9310884" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.212816 
4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s954s" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.303069 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f"] Dec 01 11:03:36 crc kubenswrapper[4909]: E1201 11:03:36.303574 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b42ea5-6854-49d3-82bb-d7559fc8e9d5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.303593 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b42ea5-6854-49d3-82bb-d7559fc8e9d5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.303779 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b42ea5-6854-49d3-82bb-d7559fc8e9d5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.304433 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.306491 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.308462 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.308462 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.308611 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.308723 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.312866 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f"] Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.386566 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f\" (UID: \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.386678 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drfbk\" (UniqueName: \"kubernetes.io/projected/4e2a41d6-d2aa-4db3-828a-e66538f66af0-kube-api-access-drfbk\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f\" (UID: 
\"4e2a41d6-d2aa-4db3-828a-e66538f66af0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.386721 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f\" (UID: \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.386741 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f\" (UID: \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.386798 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f\" (UID: \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.488688 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f\" (UID: \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.488780 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f\" (UID: \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.488849 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drfbk\" (UniqueName: \"kubernetes.io/projected/4e2a41d6-d2aa-4db3-828a-e66538f66af0-kube-api-access-drfbk\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f\" (UID: \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.488902 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f\" (UID: \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.488921 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f\" (UID: \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.493637 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f\" (UID: \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.494470 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f\" (UID: \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.499020 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f\" (UID: \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.500258 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f\" (UID: \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.504819 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drfbk\" (UniqueName: \"kubernetes.io/projected/4e2a41d6-d2aa-4db3-828a-e66538f66af0-kube-api-access-drfbk\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f\" (UID: \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" Dec 01 11:03:36 crc kubenswrapper[4909]: I1201 11:03:36.640393 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" Dec 01 11:03:37 crc kubenswrapper[4909]: I1201 11:03:37.150351 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f"] Dec 01 11:03:37 crc kubenswrapper[4909]: I1201 11:03:37.222627 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" event={"ID":"4e2a41d6-d2aa-4db3-828a-e66538f66af0","Type":"ContainerStarted","Data":"760859faf23999847695545e458be7a59dcbf4a0b8a1a737468a37ef994bc767"} Dec 01 11:03:38 crc kubenswrapper[4909]: I1201 11:03:38.234063 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" event={"ID":"4e2a41d6-d2aa-4db3-828a-e66538f66af0","Type":"ContainerStarted","Data":"8e626fd36cd4b1e8a0abd354e8987cf640725bee6d49c83dab586583cb818437"} Dec 01 11:03:38 crc kubenswrapper[4909]: I1201 11:03:38.255855 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" podStartSLOduration=1.805868142 podStartE2EDuration="2.255832745s" podCreationTimestamp="2025-12-01 11:03:36 +0000 UTC" firstStartedPulling="2025-12-01 11:03:37.167560429 +0000 UTC m=+1934.402031327" lastFinishedPulling="2025-12-01 11:03:37.617525032 +0000 UTC m=+1934.851995930" observedRunningTime="2025-12-01 11:03:38.249966279 +0000 UTC m=+1935.484437187" watchObservedRunningTime="2025-12-01 11:03:38.255832745 +0000 UTC m=+1935.490303643" Dec 01 11:03:41 crc kubenswrapper[4909]: I1201 11:03:41.862098 4909 scope.go:117] "RemoveContainer" containerID="43fb82b9f45c471b97e34fa64c47077ef31b6ecfb74d5a31e2c7fe0562b1116b" Dec 01 11:03:41 crc kubenswrapper[4909]: I1201 11:03:41.897594 4909 scope.go:117] "RemoveContainer" containerID="e453b33804409c9bbb58cfcd51f05c03dee6ee5e4aca100872a7861129b4b162" Dec 01 11:03:41 crc 
kubenswrapper[4909]: I1201 11:03:41.949110 4909 scope.go:117] "RemoveContainer" containerID="c42a594113560ab1346b41acbaddb119f4b00f5b16804eac70747d795260e7a4" Dec 01 11:03:42 crc kubenswrapper[4909]: I1201 11:03:42.046546 4909 scope.go:117] "RemoveContainer" containerID="d16a6b80c9b1c456d78308d24b546609cb93201bbbf726c3b7d1ca05e6f763cc" Dec 01 11:04:06 crc kubenswrapper[4909]: I1201 11:04:06.193651 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:04:06 crc kubenswrapper[4909]: I1201 11:04:06.194476 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:04:36 crc kubenswrapper[4909]: I1201 11:04:36.194220 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:04:36 crc kubenswrapper[4909]: I1201 11:04:36.196049 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:04:42 crc kubenswrapper[4909]: I1201 11:04:42.183100 4909 scope.go:117] "RemoveContainer" 
containerID="ae6117ff143fefd8aa362d187bc33f690629319fb96976a1c1db39090f4e2fb2" Dec 01 11:04:42 crc kubenswrapper[4909]: I1201 11:04:42.217559 4909 scope.go:117] "RemoveContainer" containerID="a1725f9c2d3e35ac72dde2cc691b22aef77a2f3c6f49c86d579cf5f58ffec315" Dec 01 11:04:42 crc kubenswrapper[4909]: I1201 11:04:42.292591 4909 scope.go:117] "RemoveContainer" containerID="9388fe405be7a5d013e385d5eaf520b74383e7f80d4108ad99c0b38e1df51d6e" Dec 01 11:05:06 crc kubenswrapper[4909]: I1201 11:05:06.194283 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:05:06 crc kubenswrapper[4909]: I1201 11:05:06.194850 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:05:06 crc kubenswrapper[4909]: I1201 11:05:06.194921 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 11:05:06 crc kubenswrapper[4909]: I1201 11:05:06.195717 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f41d4464aa3fd8f418a1acd106f20e41fd1625eaf7c86916aa685bf86d68ce5c"} pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 11:05:06 crc kubenswrapper[4909]: I1201 11:05:06.195786 4909 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" containerID="cri-o://f41d4464aa3fd8f418a1acd106f20e41fd1625eaf7c86916aa685bf86d68ce5c" gracePeriod=600 Dec 01 11:05:06 crc kubenswrapper[4909]: I1201 11:05:06.971213 4909 generic.go:334] "Generic (PLEG): container finished" podID="672850e4-d044-44cc-b8a2-517dc1a285be" containerID="f41d4464aa3fd8f418a1acd106f20e41fd1625eaf7c86916aa685bf86d68ce5c" exitCode=0 Dec 01 11:05:06 crc kubenswrapper[4909]: I1201 11:05:06.971274 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerDied","Data":"f41d4464aa3fd8f418a1acd106f20e41fd1625eaf7c86916aa685bf86d68ce5c"} Dec 01 11:05:06 crc kubenswrapper[4909]: I1201 11:05:06.971749 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08"} Dec 01 11:05:06 crc kubenswrapper[4909]: I1201 11:05:06.971769 4909 scope.go:117] "RemoveContainer" containerID="984b9017931594f4d079599fc4bbaf3ac14641cecb9d165104859f2ab88f20fb" Dec 01 11:05:13 crc kubenswrapper[4909]: I1201 11:05:13.045491 4909 generic.go:334] "Generic (PLEG): container finished" podID="4e2a41d6-d2aa-4db3-828a-e66538f66af0" containerID="8e626fd36cd4b1e8a0abd354e8987cf640725bee6d49c83dab586583cb818437" exitCode=0 Dec 01 11:05:13 crc kubenswrapper[4909]: I1201 11:05:13.045634 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" event={"ID":"4e2a41d6-d2aa-4db3-828a-e66538f66af0","Type":"ContainerDied","Data":"8e626fd36cd4b1e8a0abd354e8987cf640725bee6d49c83dab586583cb818437"} Dec 01 11:05:14 crc kubenswrapper[4909]: 
I1201 11:05:14.462155 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" Dec 01 11:05:14 crc kubenswrapper[4909]: I1201 11:05:14.536482 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-ceph\") pod \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\" (UID: \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\") " Dec 01 11:05:14 crc kubenswrapper[4909]: I1201 11:05:14.536570 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-ssh-key\") pod \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\" (UID: \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\") " Dec 01 11:05:14 crc kubenswrapper[4909]: I1201 11:05:14.536735 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-inventory\") pod \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\" (UID: \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\") " Dec 01 11:05:14 crc kubenswrapper[4909]: I1201 11:05:14.536760 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-bootstrap-combined-ca-bundle\") pod \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\" (UID: \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\") " Dec 01 11:05:14 crc kubenswrapper[4909]: I1201 11:05:14.536801 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drfbk\" (UniqueName: \"kubernetes.io/projected/4e2a41d6-d2aa-4db3-828a-e66538f66af0-kube-api-access-drfbk\") pod \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\" (UID: \"4e2a41d6-d2aa-4db3-828a-e66538f66af0\") " Dec 01 11:05:14 crc kubenswrapper[4909]: I1201 11:05:14.544469 4909 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-ceph" (OuterVolumeSpecName: "ceph") pod "4e2a41d6-d2aa-4db3-828a-e66538f66af0" (UID: "4e2a41d6-d2aa-4db3-828a-e66538f66af0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:05:14 crc kubenswrapper[4909]: I1201 11:05:14.544539 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e2a41d6-d2aa-4db3-828a-e66538f66af0-kube-api-access-drfbk" (OuterVolumeSpecName: "kube-api-access-drfbk") pod "4e2a41d6-d2aa-4db3-828a-e66538f66af0" (UID: "4e2a41d6-d2aa-4db3-828a-e66538f66af0"). InnerVolumeSpecName "kube-api-access-drfbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:05:14 crc kubenswrapper[4909]: I1201 11:05:14.545010 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "4e2a41d6-d2aa-4db3-828a-e66538f66af0" (UID: "4e2a41d6-d2aa-4db3-828a-e66538f66af0"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:05:14 crc kubenswrapper[4909]: I1201 11:05:14.568210 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4e2a41d6-d2aa-4db3-828a-e66538f66af0" (UID: "4e2a41d6-d2aa-4db3-828a-e66538f66af0"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:05:14 crc kubenswrapper[4909]: I1201 11:05:14.569217 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-inventory" (OuterVolumeSpecName: "inventory") pod "4e2a41d6-d2aa-4db3-828a-e66538f66af0" (UID: "4e2a41d6-d2aa-4db3-828a-e66538f66af0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:05:14 crc kubenswrapper[4909]: I1201 11:05:14.639284 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 11:05:14 crc kubenswrapper[4909]: I1201 11:05:14.639335 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 11:05:14 crc kubenswrapper[4909]: I1201 11:05:14.639353 4909 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:05:14 crc kubenswrapper[4909]: I1201 11:05:14.639367 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drfbk\" (UniqueName: \"kubernetes.io/projected/4e2a41d6-d2aa-4db3-828a-e66538f66af0-kube-api-access-drfbk\") on node \"crc\" DevicePath \"\"" Dec 01 11:05:14 crc kubenswrapper[4909]: I1201 11:05:14.639376 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4e2a41d6-d2aa-4db3-828a-e66538f66af0-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.064773 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" 
event={"ID":"4e2a41d6-d2aa-4db3-828a-e66538f66af0","Type":"ContainerDied","Data":"760859faf23999847695545e458be7a59dcbf4a0b8a1a737468a37ef994bc767"} Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.064822 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="760859faf23999847695545e458be7a59dcbf4a0b8a1a737468a37ef994bc767" Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.064863 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f" Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.170348 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw"] Dec 01 11:05:15 crc kubenswrapper[4909]: E1201 11:05:15.170865 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e2a41d6-d2aa-4db3-828a-e66538f66af0" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.170914 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2a41d6-d2aa-4db3-828a-e66538f66af0" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.171176 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e2a41d6-d2aa-4db3-828a-e66538f66af0" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.172163 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw"
Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.181928 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw"]
Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.206713 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.207030 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.207259 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.209485 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.209752 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv"
Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.252666 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-562bw\" (UID: \"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw"
Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.253036 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-562bw\" (UID: \"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw"
Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.253351 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-562bw\" (UID: \"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw"
Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.253454 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqxsm\" (UniqueName: \"kubernetes.io/projected/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-kube-api-access-fqxsm\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-562bw\" (UID: \"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw"
Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.356317 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-562bw\" (UID: \"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw"
Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.356401 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqxsm\" (UniqueName: \"kubernetes.io/projected/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-kube-api-access-fqxsm\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-562bw\" (UID: \"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw"
Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.356525 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-562bw\" (UID: \"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw"
Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.356618 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-562bw\" (UID: \"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw"
Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.361247 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-562bw\" (UID: \"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw"
Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.361258 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-562bw\" (UID: \"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw"
Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.382652 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-562bw\" (UID: \"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw"
Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.398809 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqxsm\" (UniqueName: \"kubernetes.io/projected/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-kube-api-access-fqxsm\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-562bw\" (UID: \"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw"
Dec 01 11:05:15 crc kubenswrapper[4909]: I1201 11:05:15.511746 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw"
Dec 01 11:05:16 crc kubenswrapper[4909]: I1201 11:05:16.056649 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw"]
Dec 01 11:05:16 crc kubenswrapper[4909]: I1201 11:05:16.076024 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw" event={"ID":"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8","Type":"ContainerStarted","Data":"a629eb6054f4a21d5ca33abe83310e03ac8a2e5435375a65e620c16c0fe17e51"}
Dec 01 11:05:18 crc kubenswrapper[4909]: I1201 11:05:18.094625 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw" event={"ID":"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8","Type":"ContainerStarted","Data":"a770cd804b600fe7a25fe3ecdeb26e0966fc66de6194c208cd8f73ba411d9f1a"}
Dec 01 11:05:18 crc kubenswrapper[4909]: I1201 11:05:18.119382 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw" podStartSLOduration=1.7131902879999998 podStartE2EDuration="3.11936318s" podCreationTimestamp="2025-12-01 11:05:15 +0000 UTC" firstStartedPulling="2025-12-01 11:05:16.063849127 +0000 UTC m=+2033.298320025" lastFinishedPulling="2025-12-01 11:05:17.470022019 +0000 UTC m=+2034.704492917" observedRunningTime="2025-12-01 11:05:18.111092929 +0000 UTC m=+2035.345563837" watchObservedRunningTime="2025-12-01 11:05:18.11936318 +0000 UTC m=+2035.353834078"
Dec 01 11:05:42 crc kubenswrapper[4909]: I1201 11:05:42.336591 4909 generic.go:334] "Generic (PLEG): container finished" podID="a884bb23-e602-4578-8fa9-ac8b6fdf5eb8" containerID="a770cd804b600fe7a25fe3ecdeb26e0966fc66de6194c208cd8f73ba411d9f1a" exitCode=0
Dec 01 11:05:42 crc kubenswrapper[4909]: I1201 11:05:42.336573 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw" event={"ID":"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8","Type":"ContainerDied","Data":"a770cd804b600fe7a25fe3ecdeb26e0966fc66de6194c208cd8f73ba411d9f1a"}
Dec 01 11:05:42 crc kubenswrapper[4909]: I1201 11:05:42.396368 4909 scope.go:117] "RemoveContainer" containerID="3e97f64bc055c7c00e582532c5a6db76a4a39df8226c08a08996370a7861e495"
Dec 01 11:05:42 crc kubenswrapper[4909]: I1201 11:05:42.449199 4909 scope.go:117] "RemoveContainer" containerID="558f8db742b7478f3c474863faf382206ca28f5350ab0b35d73a2c6e226de0b3"
Dec 01 11:05:43 crc kubenswrapper[4909]: I1201 11:05:43.711887 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw"
Dec 01 11:05:43 crc kubenswrapper[4909]: I1201 11:05:43.750134 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-ceph\") pod \"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8\" (UID: \"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8\") "
Dec 01 11:05:43 crc kubenswrapper[4909]: I1201 11:05:43.750383 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-ssh-key\") pod \"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8\" (UID: \"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8\") "
Dec 01 11:05:43 crc kubenswrapper[4909]: I1201 11:05:43.750555 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqxsm\" (UniqueName: \"kubernetes.io/projected/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-kube-api-access-fqxsm\") pod \"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8\" (UID: \"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8\") "
Dec 01 11:05:43 crc kubenswrapper[4909]: I1201 11:05:43.750608 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-inventory\") pod \"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8\" (UID: \"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8\") "
Dec 01 11:05:43 crc kubenswrapper[4909]: I1201 11:05:43.757480 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-kube-api-access-fqxsm" (OuterVolumeSpecName: "kube-api-access-fqxsm") pod "a884bb23-e602-4578-8fa9-ac8b6fdf5eb8" (UID: "a884bb23-e602-4578-8fa9-ac8b6fdf5eb8"). InnerVolumeSpecName "kube-api-access-fqxsm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:05:43 crc kubenswrapper[4909]: I1201 11:05:43.758052 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-ceph" (OuterVolumeSpecName: "ceph") pod "a884bb23-e602-4578-8fa9-ac8b6fdf5eb8" (UID: "a884bb23-e602-4578-8fa9-ac8b6fdf5eb8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:05:43 crc kubenswrapper[4909]: I1201 11:05:43.780611 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-inventory" (OuterVolumeSpecName: "inventory") pod "a884bb23-e602-4578-8fa9-ac8b6fdf5eb8" (UID: "a884bb23-e602-4578-8fa9-ac8b6fdf5eb8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:05:43 crc kubenswrapper[4909]: I1201 11:05:43.782258 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a884bb23-e602-4578-8fa9-ac8b6fdf5eb8" (UID: "a884bb23-e602-4578-8fa9-ac8b6fdf5eb8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:05:43 crc kubenswrapper[4909]: I1201 11:05:43.853448 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqxsm\" (UniqueName: \"kubernetes.io/projected/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-kube-api-access-fqxsm\") on node \"crc\" DevicePath \"\""
Dec 01 11:05:43 crc kubenswrapper[4909]: I1201 11:05:43.853511 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-inventory\") on node \"crc\" DevicePath \"\""
Dec 01 11:05:43 crc kubenswrapper[4909]: I1201 11:05:43.853521 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-ceph\") on node \"crc\" DevicePath \"\""
Dec 01 11:05:43 crc kubenswrapper[4909]: I1201 11:05:43.853531 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a884bb23-e602-4578-8fa9-ac8b6fdf5eb8-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.368145 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw" event={"ID":"a884bb23-e602-4578-8fa9-ac8b6fdf5eb8","Type":"ContainerDied","Data":"a629eb6054f4a21d5ca33abe83310e03ac8a2e5435375a65e620c16c0fe17e51"}
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.368740 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a629eb6054f4a21d5ca33abe83310e03ac8a2e5435375a65e620c16c0fe17e51"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.368193 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-562bw"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.431936 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5"]
Dec 01 11:05:44 crc kubenswrapper[4909]: E1201 11:05:44.432344 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a884bb23-e602-4578-8fa9-ac8b6fdf5eb8" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.432365 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a884bb23-e602-4578-8fa9-ac8b6fdf5eb8" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.432556 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a884bb23-e602-4578-8fa9-ac8b6fdf5eb8" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.433253 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.441392 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.441506 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.441844 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.441927 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.442009 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.446342 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5"]
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.465378 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73edb31c-ae80-4a89-96a4-496594406256-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5\" (UID: \"73edb31c-ae80-4a89-96a4-496594406256\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.465474 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwfb4\" (UniqueName: \"kubernetes.io/projected/73edb31c-ae80-4a89-96a4-496594406256-kube-api-access-hwfb4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5\" (UID: \"73edb31c-ae80-4a89-96a4-496594406256\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.465586 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73edb31c-ae80-4a89-96a4-496594406256-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5\" (UID: \"73edb31c-ae80-4a89-96a4-496594406256\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.465677 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73edb31c-ae80-4a89-96a4-496594406256-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5\" (UID: \"73edb31c-ae80-4a89-96a4-496594406256\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.566684 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73edb31c-ae80-4a89-96a4-496594406256-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5\" (UID: \"73edb31c-ae80-4a89-96a4-496594406256\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.566789 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73edb31c-ae80-4a89-96a4-496594406256-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5\" (UID: \"73edb31c-ae80-4a89-96a4-496594406256\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.566848 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73edb31c-ae80-4a89-96a4-496594406256-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5\" (UID: \"73edb31c-ae80-4a89-96a4-496594406256\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.566913 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwfb4\" (UniqueName: \"kubernetes.io/projected/73edb31c-ae80-4a89-96a4-496594406256-kube-api-access-hwfb4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5\" (UID: \"73edb31c-ae80-4a89-96a4-496594406256\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.572326 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73edb31c-ae80-4a89-96a4-496594406256-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5\" (UID: \"73edb31c-ae80-4a89-96a4-496594406256\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.572499 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73edb31c-ae80-4a89-96a4-496594406256-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5\" (UID: \"73edb31c-ae80-4a89-96a4-496594406256\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.574525 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73edb31c-ae80-4a89-96a4-496594406256-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5\" (UID: \"73edb31c-ae80-4a89-96a4-496594406256\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.585054 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwfb4\" (UniqueName: \"kubernetes.io/projected/73edb31c-ae80-4a89-96a4-496594406256-kube-api-access-hwfb4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5\" (UID: \"73edb31c-ae80-4a89-96a4-496594406256\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5"
Dec 01 11:05:44 crc kubenswrapper[4909]: I1201 11:05:44.759157 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5"
Dec 01 11:05:45 crc kubenswrapper[4909]: I1201 11:05:45.250758 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5"]
Dec 01 11:05:45 crc kubenswrapper[4909]: I1201 11:05:45.376285 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5" event={"ID":"73edb31c-ae80-4a89-96a4-496594406256","Type":"ContainerStarted","Data":"cdad30291d75dc46ff38d023645b344de806452f0aa41a306d9046361a0c991a"}
Dec 01 11:05:46 crc kubenswrapper[4909]: I1201 11:05:46.385085 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5" event={"ID":"73edb31c-ae80-4a89-96a4-496594406256","Type":"ContainerStarted","Data":"40e3ce9b5f60e275bb070534f15488c608bf44a56568ed235f51f4053ceff015"}
Dec 01 11:05:46 crc kubenswrapper[4909]: I1201 11:05:46.406966 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5" podStartSLOduration=1.94888583 podStartE2EDuration="2.406948282s" podCreationTimestamp="2025-12-01 11:05:44 +0000 UTC" firstStartedPulling="2025-12-01 11:05:45.257225096 +0000 UTC m=+2062.491695994" lastFinishedPulling="2025-12-01 11:05:45.715287548 +0000 UTC m=+2062.949758446" observedRunningTime="2025-12-01 11:05:46.399619269 +0000 UTC m=+2063.634090177" watchObservedRunningTime="2025-12-01 11:05:46.406948282 +0000 UTC m=+2063.641419180"
Dec 01 11:05:51 crc kubenswrapper[4909]: I1201 11:05:51.444144 4909 generic.go:334] "Generic (PLEG): container finished" podID="73edb31c-ae80-4a89-96a4-496594406256" containerID="40e3ce9b5f60e275bb070534f15488c608bf44a56568ed235f51f4053ceff015" exitCode=0
Dec 01 11:05:51 crc kubenswrapper[4909]: I1201 11:05:51.444249 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5" event={"ID":"73edb31c-ae80-4a89-96a4-496594406256","Type":"ContainerDied","Data":"40e3ce9b5f60e275bb070534f15488c608bf44a56568ed235f51f4053ceff015"}
Dec 01 11:05:52 crc kubenswrapper[4909]: I1201 11:05:52.860492 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5"
Dec 01 11:05:52 crc kubenswrapper[4909]: I1201 11:05:52.937544 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73edb31c-ae80-4a89-96a4-496594406256-inventory\") pod \"73edb31c-ae80-4a89-96a4-496594406256\" (UID: \"73edb31c-ae80-4a89-96a4-496594406256\") "
Dec 01 11:05:52 crc kubenswrapper[4909]: I1201 11:05:52.938099 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73edb31c-ae80-4a89-96a4-496594406256-ceph\") pod \"73edb31c-ae80-4a89-96a4-496594406256\" (UID: \"73edb31c-ae80-4a89-96a4-496594406256\") "
Dec 01 11:05:52 crc kubenswrapper[4909]: I1201 11:05:52.938189 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73edb31c-ae80-4a89-96a4-496594406256-ssh-key\") pod \"73edb31c-ae80-4a89-96a4-496594406256\" (UID: \"73edb31c-ae80-4a89-96a4-496594406256\") "
Dec 01 11:05:52 crc kubenswrapper[4909]: I1201 11:05:52.938364 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwfb4\" (UniqueName: \"kubernetes.io/projected/73edb31c-ae80-4a89-96a4-496594406256-kube-api-access-hwfb4\") pod \"73edb31c-ae80-4a89-96a4-496594406256\" (UID: \"73edb31c-ae80-4a89-96a4-496594406256\") "
Dec 01 11:05:52 crc kubenswrapper[4909]: I1201 11:05:52.943192 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73edb31c-ae80-4a89-96a4-496594406256-ceph" (OuterVolumeSpecName: "ceph") pod "73edb31c-ae80-4a89-96a4-496594406256" (UID: "73edb31c-ae80-4a89-96a4-496594406256"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:05:52 crc kubenswrapper[4909]: I1201 11:05:52.962418 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73edb31c-ae80-4a89-96a4-496594406256-kube-api-access-hwfb4" (OuterVolumeSpecName: "kube-api-access-hwfb4") pod "73edb31c-ae80-4a89-96a4-496594406256" (UID: "73edb31c-ae80-4a89-96a4-496594406256"). InnerVolumeSpecName "kube-api-access-hwfb4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:05:52 crc kubenswrapper[4909]: I1201 11:05:52.974188 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73edb31c-ae80-4a89-96a4-496594406256-inventory" (OuterVolumeSpecName: "inventory") pod "73edb31c-ae80-4a89-96a4-496594406256" (UID: "73edb31c-ae80-4a89-96a4-496594406256"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:05:52 crc kubenswrapper[4909]: I1201 11:05:52.974207 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73edb31c-ae80-4a89-96a4-496594406256-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "73edb31c-ae80-4a89-96a4-496594406256" (UID: "73edb31c-ae80-4a89-96a4-496594406256"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.041578 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwfb4\" (UniqueName: \"kubernetes.io/projected/73edb31c-ae80-4a89-96a4-496594406256-kube-api-access-hwfb4\") on node \"crc\" DevicePath \"\""
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.041731 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73edb31c-ae80-4a89-96a4-496594406256-inventory\") on node \"crc\" DevicePath \"\""
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.041744 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73edb31c-ae80-4a89-96a4-496594406256-ceph\") on node \"crc\" DevicePath \"\""
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.041756 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73edb31c-ae80-4a89-96a4-496594406256-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.468460 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5" event={"ID":"73edb31c-ae80-4a89-96a4-496594406256","Type":"ContainerDied","Data":"cdad30291d75dc46ff38d023645b344de806452f0aa41a306d9046361a0c991a"}
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.469129 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdad30291d75dc46ff38d023645b344de806452f0aa41a306d9046361a0c991a"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.468586 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.547561 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh"]
Dec 01 11:05:53 crc kubenswrapper[4909]: E1201 11:05:53.548206 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73edb31c-ae80-4a89-96a4-496594406256" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.548237 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="73edb31c-ae80-4a89-96a4-496594406256" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.548511 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="73edb31c-ae80-4a89-96a4-496594406256" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.549590 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.552971 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.553291 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.553470 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.553710 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.556806 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.562952 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh"]
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.656601 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb6d5a74-703d-4f32-9066-61a28fbad67f-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-72glh\" (UID: \"bb6d5a74-703d-4f32-9066-61a28fbad67f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.657026 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v44w\" (UniqueName: \"kubernetes.io/projected/bb6d5a74-703d-4f32-9066-61a28fbad67f-kube-api-access-9v44w\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-72glh\" (UID: \"bb6d5a74-703d-4f32-9066-61a28fbad67f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.657148 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb6d5a74-703d-4f32-9066-61a28fbad67f-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-72glh\" (UID: \"bb6d5a74-703d-4f32-9066-61a28fbad67f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.657339 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb6d5a74-703d-4f32-9066-61a28fbad67f-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-72glh\" (UID: \"bb6d5a74-703d-4f32-9066-61a28fbad67f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.759951 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb6d5a74-703d-4f32-9066-61a28fbad67f-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-72glh\" (UID: \"bb6d5a74-703d-4f32-9066-61a28fbad67f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.760027 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v44w\" (UniqueName: \"kubernetes.io/projected/bb6d5a74-703d-4f32-9066-61a28fbad67f-kube-api-access-9v44w\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-72glh\" (UID: \"bb6d5a74-703d-4f32-9066-61a28fbad67f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.760068 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb6d5a74-703d-4f32-9066-61a28fbad67f-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-72glh\" (UID: \"bb6d5a74-703d-4f32-9066-61a28fbad67f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.760115 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb6d5a74-703d-4f32-9066-61a28fbad67f-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-72glh\" (UID: \"bb6d5a74-703d-4f32-9066-61a28fbad67f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.767376 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb6d5a74-703d-4f32-9066-61a28fbad67f-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-72glh\" (UID: \"bb6d5a74-703d-4f32-9066-61a28fbad67f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.768617 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb6d5a74-703d-4f32-9066-61a28fbad67f-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-72glh\" (UID: \"bb6d5a74-703d-4f32-9066-61a28fbad67f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.769738 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb6d5a74-703d-4f32-9066-61a28fbad67f-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-72glh\" (UID: \"bb6d5a74-703d-4f32-9066-61a28fbad67f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.780779 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v44w\" (UniqueName: \"kubernetes.io/projected/bb6d5a74-703d-4f32-9066-61a28fbad67f-kube-api-access-9v44w\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-72glh\" (UID: \"bb6d5a74-703d-4f32-9066-61a28fbad67f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh"
Dec 01 11:05:53 crc kubenswrapper[4909]: I1201 11:05:53.871135 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh"
Dec 01 11:05:54 crc kubenswrapper[4909]: I1201 11:05:54.440254 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh"]
Dec 01 11:05:54 crc kubenswrapper[4909]: I1201 11:05:54.477252 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh" event={"ID":"bb6d5a74-703d-4f32-9066-61a28fbad67f","Type":"ContainerStarted","Data":"cc5b3db70286d2f1431a438793d365dbc810353df58e72bca7c5c77ebbe48310"}
Dec 01 11:05:55 crc kubenswrapper[4909]: I1201 11:05:55.488961 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh" event={"ID":"bb6d5a74-703d-4f32-9066-61a28fbad67f","Type":"ContainerStarted","Data":"4ebae50546d9d30d773cd2e58d7dee6912a66cbdf8c67e0911b0fbd1bc87cea8"}
Dec 01 11:05:55 crc kubenswrapper[4909]: I1201 11:05:55.511734 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh" podStartSLOduration=2.0163363 podStartE2EDuration="2.511718962s" podCreationTimestamp="2025-12-01 11:05:53 +0000 UTC" firstStartedPulling="2025-12-01 11:05:54.449012675 +0000 UTC m=+2071.683483563" lastFinishedPulling="2025-12-01 11:05:54.944395337 +0000 UTC m=+2072.178866225" observedRunningTime="2025-12-01 11:05:55.509187884 +0000 UTC m=+2072.743658782" watchObservedRunningTime="2025-12-01 11:05:55.511718962 +0000 UTC m=+2072.746189860"
Dec 01 11:06:29 crc kubenswrapper[4909]: I1201 11:06:29.764844 4909 generic.go:334] "Generic (PLEG): container finished" podID="bb6d5a74-703d-4f32-9066-61a28fbad67f" containerID="4ebae50546d9d30d773cd2e58d7dee6912a66cbdf8c67e0911b0fbd1bc87cea8" exitCode=0
Dec 01 11:06:29 crc kubenswrapper[4909]: I1201 11:06:29.764904 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh" event={"ID":"bb6d5a74-703d-4f32-9066-61a28fbad67f","Type":"ContainerDied","Data":"4ebae50546d9d30d773cd2e58d7dee6912a66cbdf8c67e0911b0fbd1bc87cea8"}
Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.200459 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh"
Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.236532 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb6d5a74-703d-4f32-9066-61a28fbad67f-ceph\") pod \"bb6d5a74-703d-4f32-9066-61a28fbad67f\" (UID: \"bb6d5a74-703d-4f32-9066-61a28fbad67f\") "
Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.236629 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v44w\" (UniqueName: \"kubernetes.io/projected/bb6d5a74-703d-4f32-9066-61a28fbad67f-kube-api-access-9v44w\") pod \"bb6d5a74-703d-4f32-9066-61a28fbad67f\" (UID: \"bb6d5a74-703d-4f32-9066-61a28fbad67f\") "
Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.236670 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb6d5a74-703d-4f32-9066-61a28fbad67f-ssh-key\") pod \"bb6d5a74-703d-4f32-9066-61a28fbad67f\" (UID: \"bb6d5a74-703d-4f32-9066-61a28fbad67f\") "
Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.236692 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb6d5a74-703d-4f32-9066-61a28fbad67f-inventory\") pod \"bb6d5a74-703d-4f32-9066-61a28fbad67f\" (UID: \"bb6d5a74-703d-4f32-9066-61a28fbad67f\") " Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.243526 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb6d5a74-703d-4f32-9066-61a28fbad67f-kube-api-access-9v44w" (OuterVolumeSpecName: "kube-api-access-9v44w") pod "bb6d5a74-703d-4f32-9066-61a28fbad67f" (UID: "bb6d5a74-703d-4f32-9066-61a28fbad67f"). InnerVolumeSpecName "kube-api-access-9v44w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.245279 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6d5a74-703d-4f32-9066-61a28fbad67f-ceph" (OuterVolumeSpecName: "ceph") pod "bb6d5a74-703d-4f32-9066-61a28fbad67f" (UID: "bb6d5a74-703d-4f32-9066-61a28fbad67f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.270913 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6d5a74-703d-4f32-9066-61a28fbad67f-inventory" (OuterVolumeSpecName: "inventory") pod "bb6d5a74-703d-4f32-9066-61a28fbad67f" (UID: "bb6d5a74-703d-4f32-9066-61a28fbad67f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.272405 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6d5a74-703d-4f32-9066-61a28fbad67f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bb6d5a74-703d-4f32-9066-61a28fbad67f" (UID: "bb6d5a74-703d-4f32-9066-61a28fbad67f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.337835 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v44w\" (UniqueName: \"kubernetes.io/projected/bb6d5a74-703d-4f32-9066-61a28fbad67f-kube-api-access-9v44w\") on node \"crc\" DevicePath \"\"" Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.337862 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb6d5a74-703d-4f32-9066-61a28fbad67f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.337870 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb6d5a74-703d-4f32-9066-61a28fbad67f-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.337905 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb6d5a74-703d-4f32-9066-61a28fbad67f-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.783489 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh" event={"ID":"bb6d5a74-703d-4f32-9066-61a28fbad67f","Type":"ContainerDied","Data":"cc5b3db70286d2f1431a438793d365dbc810353df58e72bca7c5c77ebbe48310"} Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.783904 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc5b3db70286d2f1431a438793d365dbc810353df58e72bca7c5c77ebbe48310" Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.783585 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-72glh" Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.956637 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf"] Dec 01 11:06:31 crc kubenswrapper[4909]: E1201 11:06:31.957235 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb6d5a74-703d-4f32-9066-61a28fbad67f" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.957262 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb6d5a74-703d-4f32-9066-61a28fbad67f" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.957573 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb6d5a74-703d-4f32-9066-61a28fbad67f" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.958363 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf" Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.961830 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.962022 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.962180 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.962237 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.969111 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf"] Dec 01 11:06:31 crc kubenswrapper[4909]: I1201 11:06:31.969369 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.048498 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf\" (UID: \"992fb0af-fdd9-464d-92cc-454f8cc2cfb4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf" Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.048805 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf\" (UID: \"992fb0af-fdd9-464d-92cc-454f8cc2cfb4\") " 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf" Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.048938 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf\" (UID: \"992fb0af-fdd9-464d-92cc-454f8cc2cfb4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf" Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.048989 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km6jr\" (UniqueName: \"kubernetes.io/projected/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-kube-api-access-km6jr\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf\" (UID: \"992fb0af-fdd9-464d-92cc-454f8cc2cfb4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf" Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.150069 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf\" (UID: \"992fb0af-fdd9-464d-92cc-454f8cc2cfb4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf" Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.150122 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf\" (UID: \"992fb0af-fdd9-464d-92cc-454f8cc2cfb4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf" Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.150156 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km6jr\" 
(UniqueName: \"kubernetes.io/projected/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-kube-api-access-km6jr\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf\" (UID: \"992fb0af-fdd9-464d-92cc-454f8cc2cfb4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf" Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.150230 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf\" (UID: \"992fb0af-fdd9-464d-92cc-454f8cc2cfb4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf" Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.156660 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf\" (UID: \"992fb0af-fdd9-464d-92cc-454f8cc2cfb4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf" Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.160487 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf\" (UID: \"992fb0af-fdd9-464d-92cc-454f8cc2cfb4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf" Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.162305 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf\" (UID: \"992fb0af-fdd9-464d-92cc-454f8cc2cfb4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf" Dec 01 11:06:32 crc kubenswrapper[4909]: 
I1201 11:06:32.176287 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km6jr\" (UniqueName: \"kubernetes.io/projected/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-kube-api-access-km6jr\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf\" (UID: \"992fb0af-fdd9-464d-92cc-454f8cc2cfb4\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf" Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.286173 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf" Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.526454 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8k2pv"] Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.532489 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8k2pv" Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.537521 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8k2pv"] Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.558859 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfqcv\" (UniqueName: \"kubernetes.io/projected/98802c8d-3009-4095-9e8f-ac3ecbbfda12-kube-api-access-bfqcv\") pod \"redhat-marketplace-8k2pv\" (UID: \"98802c8d-3009-4095-9e8f-ac3ecbbfda12\") " pod="openshift-marketplace/redhat-marketplace-8k2pv" Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.559298 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98802c8d-3009-4095-9e8f-ac3ecbbfda12-catalog-content\") pod \"redhat-marketplace-8k2pv\" (UID: \"98802c8d-3009-4095-9e8f-ac3ecbbfda12\") " pod="openshift-marketplace/redhat-marketplace-8k2pv" Dec 01 
11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.560245 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98802c8d-3009-4095-9e8f-ac3ecbbfda12-utilities\") pod \"redhat-marketplace-8k2pv\" (UID: \"98802c8d-3009-4095-9e8f-ac3ecbbfda12\") " pod="openshift-marketplace/redhat-marketplace-8k2pv" Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.661410 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98802c8d-3009-4095-9e8f-ac3ecbbfda12-utilities\") pod \"redhat-marketplace-8k2pv\" (UID: \"98802c8d-3009-4095-9e8f-ac3ecbbfda12\") " pod="openshift-marketplace/redhat-marketplace-8k2pv" Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.661608 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfqcv\" (UniqueName: \"kubernetes.io/projected/98802c8d-3009-4095-9e8f-ac3ecbbfda12-kube-api-access-bfqcv\") pod \"redhat-marketplace-8k2pv\" (UID: \"98802c8d-3009-4095-9e8f-ac3ecbbfda12\") " pod="openshift-marketplace/redhat-marketplace-8k2pv" Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.661643 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98802c8d-3009-4095-9e8f-ac3ecbbfda12-catalog-content\") pod \"redhat-marketplace-8k2pv\" (UID: \"98802c8d-3009-4095-9e8f-ac3ecbbfda12\") " pod="openshift-marketplace/redhat-marketplace-8k2pv" Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.662403 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98802c8d-3009-4095-9e8f-ac3ecbbfda12-catalog-content\") pod \"redhat-marketplace-8k2pv\" (UID: \"98802c8d-3009-4095-9e8f-ac3ecbbfda12\") " pod="openshift-marketplace/redhat-marketplace-8k2pv" Dec 01 11:06:32 crc 
kubenswrapper[4909]: I1201 11:06:32.662688 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98802c8d-3009-4095-9e8f-ac3ecbbfda12-utilities\") pod \"redhat-marketplace-8k2pv\" (UID: \"98802c8d-3009-4095-9e8f-ac3ecbbfda12\") " pod="openshift-marketplace/redhat-marketplace-8k2pv" Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.691605 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfqcv\" (UniqueName: \"kubernetes.io/projected/98802c8d-3009-4095-9e8f-ac3ecbbfda12-kube-api-access-bfqcv\") pod \"redhat-marketplace-8k2pv\" (UID: \"98802c8d-3009-4095-9e8f-ac3ecbbfda12\") " pod="openshift-marketplace/redhat-marketplace-8k2pv" Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.800393 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf"] Dec 01 11:06:32 crc kubenswrapper[4909]: I1201 11:06:32.868017 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8k2pv" Dec 01 11:06:33 crc kubenswrapper[4909]: I1201 11:06:33.377078 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8k2pv"] Dec 01 11:06:33 crc kubenswrapper[4909]: I1201 11:06:33.809844 4909 generic.go:334] "Generic (PLEG): container finished" podID="98802c8d-3009-4095-9e8f-ac3ecbbfda12" containerID="37aa7b151dafe8994c7138e2fa2ebde64eaa0d3aca4a599395d528d1ca1c89d1" exitCode=0 Dec 01 11:06:33 crc kubenswrapper[4909]: I1201 11:06:33.810023 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8k2pv" event={"ID":"98802c8d-3009-4095-9e8f-ac3ecbbfda12","Type":"ContainerDied","Data":"37aa7b151dafe8994c7138e2fa2ebde64eaa0d3aca4a599395d528d1ca1c89d1"} Dec 01 11:06:33 crc kubenswrapper[4909]: I1201 11:06:33.810251 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8k2pv" event={"ID":"98802c8d-3009-4095-9e8f-ac3ecbbfda12","Type":"ContainerStarted","Data":"a3c35153fd6456ec7a431f7487f57ccdb9460fa5903b73bb2a6bbc5897ac0c12"} Dec 01 11:06:33 crc kubenswrapper[4909]: I1201 11:06:33.812476 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf" event={"ID":"992fb0af-fdd9-464d-92cc-454f8cc2cfb4","Type":"ContainerStarted","Data":"0f452bd2124d2ee5916cfe79fd0c53036629c897cdab9dcfee22e64cda830a74"} Dec 01 11:06:33 crc kubenswrapper[4909]: I1201 11:06:33.812506 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf" event={"ID":"992fb0af-fdd9-464d-92cc-454f8cc2cfb4","Type":"ContainerStarted","Data":"aa84484b5b2cca9e5ea209bc003049534f6c40b28a8654216237f2f4d5e7b31c"} Dec 01 11:06:33 crc kubenswrapper[4909]: I1201 11:06:33.859185 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf" podStartSLOduration=2.244538477 podStartE2EDuration="2.859163655s" podCreationTimestamp="2025-12-01 11:06:31 +0000 UTC" firstStartedPulling="2025-12-01 11:06:32.807282697 +0000 UTC m=+2110.041753595" lastFinishedPulling="2025-12-01 11:06:33.421907865 +0000 UTC m=+2110.656378773" observedRunningTime="2025-12-01 11:06:33.845040266 +0000 UTC m=+2111.079511194" watchObservedRunningTime="2025-12-01 11:06:33.859163655 +0000 UTC m=+2111.093634563" Dec 01 11:06:34 crc kubenswrapper[4909]: I1201 11:06:34.822376 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8k2pv" event={"ID":"98802c8d-3009-4095-9e8f-ac3ecbbfda12","Type":"ContainerStarted","Data":"7d335dbdea25089abddd3d12b49ccb92b73c1cee6674841bf26af316db768cb7"} Dec 01 11:06:35 crc kubenswrapper[4909]: I1201 11:06:35.846473 4909 generic.go:334] "Generic (PLEG): container finished" podID="98802c8d-3009-4095-9e8f-ac3ecbbfda12" containerID="7d335dbdea25089abddd3d12b49ccb92b73c1cee6674841bf26af316db768cb7" exitCode=0 Dec 01 11:06:35 crc kubenswrapper[4909]: I1201 11:06:35.848209 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8k2pv" event={"ID":"98802c8d-3009-4095-9e8f-ac3ecbbfda12","Type":"ContainerDied","Data":"7d335dbdea25089abddd3d12b49ccb92b73c1cee6674841bf26af316db768cb7"} Dec 01 11:06:36 crc kubenswrapper[4909]: I1201 11:06:36.860528 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8k2pv" event={"ID":"98802c8d-3009-4095-9e8f-ac3ecbbfda12","Type":"ContainerStarted","Data":"f3b2977369228d2fdc17a335c391c1abe5a57e2942fc32e1d6e21e2e59de019c"} Dec 01 11:06:36 crc kubenswrapper[4909]: I1201 11:06:36.889525 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8k2pv" podStartSLOduration=2.290757852 podStartE2EDuration="4.889508598s" 
podCreationTimestamp="2025-12-01 11:06:32 +0000 UTC" firstStartedPulling="2025-12-01 11:06:33.811768148 +0000 UTC m=+2111.046239046" lastFinishedPulling="2025-12-01 11:06:36.410518894 +0000 UTC m=+2113.644989792" observedRunningTime="2025-12-01 11:06:36.879886957 +0000 UTC m=+2114.114357855" watchObservedRunningTime="2025-12-01 11:06:36.889508598 +0000 UTC m=+2114.123979496" Dec 01 11:06:37 crc kubenswrapper[4909]: I1201 11:06:37.870468 4909 generic.go:334] "Generic (PLEG): container finished" podID="992fb0af-fdd9-464d-92cc-454f8cc2cfb4" containerID="0f452bd2124d2ee5916cfe79fd0c53036629c897cdab9dcfee22e64cda830a74" exitCode=0 Dec 01 11:06:37 crc kubenswrapper[4909]: I1201 11:06:37.870539 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf" event={"ID":"992fb0af-fdd9-464d-92cc-454f8cc2cfb4","Type":"ContainerDied","Data":"0f452bd2124d2ee5916cfe79fd0c53036629c897cdab9dcfee22e64cda830a74"} Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.291448 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf" Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.395566 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km6jr\" (UniqueName: \"kubernetes.io/projected/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-kube-api-access-km6jr\") pod \"992fb0af-fdd9-464d-92cc-454f8cc2cfb4\" (UID: \"992fb0af-fdd9-464d-92cc-454f8cc2cfb4\") " Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.395783 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-ssh-key\") pod \"992fb0af-fdd9-464d-92cc-454f8cc2cfb4\" (UID: \"992fb0af-fdd9-464d-92cc-454f8cc2cfb4\") " Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.395919 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-inventory\") pod \"992fb0af-fdd9-464d-92cc-454f8cc2cfb4\" (UID: \"992fb0af-fdd9-464d-92cc-454f8cc2cfb4\") " Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.395987 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-ceph\") pod \"992fb0af-fdd9-464d-92cc-454f8cc2cfb4\" (UID: \"992fb0af-fdd9-464d-92cc-454f8cc2cfb4\") " Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.401792 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-kube-api-access-km6jr" (OuterVolumeSpecName: "kube-api-access-km6jr") pod "992fb0af-fdd9-464d-92cc-454f8cc2cfb4" (UID: "992fb0af-fdd9-464d-92cc-454f8cc2cfb4"). InnerVolumeSpecName "kube-api-access-km6jr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.406916 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-ceph" (OuterVolumeSpecName: "ceph") pod "992fb0af-fdd9-464d-92cc-454f8cc2cfb4" (UID: "992fb0af-fdd9-464d-92cc-454f8cc2cfb4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.424837 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "992fb0af-fdd9-464d-92cc-454f8cc2cfb4" (UID: "992fb0af-fdd9-464d-92cc-454f8cc2cfb4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.425432 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-inventory" (OuterVolumeSpecName: "inventory") pod "992fb0af-fdd9-464d-92cc-454f8cc2cfb4" (UID: "992fb0af-fdd9-464d-92cc-454f8cc2cfb4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.497410 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.497490 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km6jr\" (UniqueName: \"kubernetes.io/projected/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-kube-api-access-km6jr\") on node \"crc\" DevicePath \"\"" Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.497507 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.497558 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/992fb0af-fdd9-464d-92cc-454f8cc2cfb4-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.887226 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf" event={"ID":"992fb0af-fdd9-464d-92cc-454f8cc2cfb4","Type":"ContainerDied","Data":"aa84484b5b2cca9e5ea209bc003049534f6c40b28a8654216237f2f4d5e7b31c"} Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.887265 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa84484b5b2cca9e5ea209bc003049534f6c40b28a8654216237f2f4d5e7b31c" Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.887283 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf"
Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.959742 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9"]
Dec 01 11:06:39 crc kubenswrapper[4909]: E1201 11:06:39.960176 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992fb0af-fdd9-464d-92cc-454f8cc2cfb4" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.960191 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="992fb0af-fdd9-464d-92cc-454f8cc2cfb4" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.960360 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="992fb0af-fdd9-464d-92cc-454f8cc2cfb4" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.960996 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9"
Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.964381 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.964733 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.964825 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.966619 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.967762 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv"
Dec 01 11:06:39 crc kubenswrapper[4909]: I1201 11:06:39.986209 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9"]
Dec 01 11:06:40 crc kubenswrapper[4909]: I1201 11:06:40.008180 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk2ml\" (UniqueName: \"kubernetes.io/projected/df51a24f-6b29-49b3-bdee-153cb29154fe-kube-api-access-bk2ml\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b24p9\" (UID: \"df51a24f-6b29-49b3-bdee-153cb29154fe\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9"
Dec 01 11:06:40 crc kubenswrapper[4909]: I1201 11:06:40.008237 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df51a24f-6b29-49b3-bdee-153cb29154fe-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b24p9\" (UID: \"df51a24f-6b29-49b3-bdee-153cb29154fe\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9"
Dec 01 11:06:40 crc kubenswrapper[4909]: I1201 11:06:40.008322 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df51a24f-6b29-49b3-bdee-153cb29154fe-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b24p9\" (UID: \"df51a24f-6b29-49b3-bdee-153cb29154fe\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9"
Dec 01 11:06:40 crc kubenswrapper[4909]: I1201 11:06:40.008396 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/df51a24f-6b29-49b3-bdee-153cb29154fe-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b24p9\" (UID: \"df51a24f-6b29-49b3-bdee-153cb29154fe\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9"
Dec 01 11:06:40 crc kubenswrapper[4909]: I1201 11:06:40.109680 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk2ml\" (UniqueName: \"kubernetes.io/projected/df51a24f-6b29-49b3-bdee-153cb29154fe-kube-api-access-bk2ml\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b24p9\" (UID: \"df51a24f-6b29-49b3-bdee-153cb29154fe\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9"
Dec 01 11:06:40 crc kubenswrapper[4909]: I1201 11:06:40.109737 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df51a24f-6b29-49b3-bdee-153cb29154fe-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b24p9\" (UID: \"df51a24f-6b29-49b3-bdee-153cb29154fe\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9"
Dec 01 11:06:40 crc kubenswrapper[4909]: I1201 11:06:40.109809 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df51a24f-6b29-49b3-bdee-153cb29154fe-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b24p9\" (UID: \"df51a24f-6b29-49b3-bdee-153cb29154fe\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9"
Dec 01 11:06:40 crc kubenswrapper[4909]: I1201 11:06:40.109901 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/df51a24f-6b29-49b3-bdee-153cb29154fe-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b24p9\" (UID: \"df51a24f-6b29-49b3-bdee-153cb29154fe\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9"
Dec 01 11:06:40 crc kubenswrapper[4909]: I1201 11:06:40.114287 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df51a24f-6b29-49b3-bdee-153cb29154fe-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b24p9\" (UID: \"df51a24f-6b29-49b3-bdee-153cb29154fe\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9"
Dec 01 11:06:40 crc kubenswrapper[4909]: I1201 11:06:40.114486 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df51a24f-6b29-49b3-bdee-153cb29154fe-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b24p9\" (UID: \"df51a24f-6b29-49b3-bdee-153cb29154fe\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9"
Dec 01 11:06:40 crc kubenswrapper[4909]: I1201 11:06:40.114685 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/df51a24f-6b29-49b3-bdee-153cb29154fe-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b24p9\" (UID: \"df51a24f-6b29-49b3-bdee-153cb29154fe\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9"
Dec 01 11:06:40 crc kubenswrapper[4909]: I1201 11:06:40.129249 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk2ml\" (UniqueName: \"kubernetes.io/projected/df51a24f-6b29-49b3-bdee-153cb29154fe-kube-api-access-bk2ml\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b24p9\" (UID: \"df51a24f-6b29-49b3-bdee-153cb29154fe\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9"
Dec 01 11:06:40 crc kubenswrapper[4909]: I1201 11:06:40.276814 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9"
Dec 01 11:06:40 crc kubenswrapper[4909]: I1201 11:06:40.793580 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9"]
Dec 01 11:06:40 crc kubenswrapper[4909]: I1201 11:06:40.898609 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9" event={"ID":"df51a24f-6b29-49b3-bdee-153cb29154fe","Type":"ContainerStarted","Data":"a4d2ba5c93a7e5339986c68a297a3d404afec73b21b95256d05ca5ac9c45311e"}
Dec 01 11:06:41 crc kubenswrapper[4909]: I1201 11:06:41.437343 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f74bk"]
Dec 01 11:06:41 crc kubenswrapper[4909]: I1201 11:06:41.439907 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f74bk"
Dec 01 11:06:41 crc kubenswrapper[4909]: I1201 11:06:41.463108 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f74bk"]
Dec 01 11:06:41 crc kubenswrapper[4909]: I1201 11:06:41.531762 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b233541-55a0-4306-8539-4ea2839d2df2-catalog-content\") pod \"certified-operators-f74bk\" (UID: \"4b233541-55a0-4306-8539-4ea2839d2df2\") " pod="openshift-marketplace/certified-operators-f74bk"
Dec 01 11:06:41 crc kubenswrapper[4909]: I1201 11:06:41.531963 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b233541-55a0-4306-8539-4ea2839d2df2-utilities\") pod \"certified-operators-f74bk\" (UID: \"4b233541-55a0-4306-8539-4ea2839d2df2\") " pod="openshift-marketplace/certified-operators-f74bk"
Dec 01 11:06:41 crc kubenswrapper[4909]: I1201 11:06:41.532128 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6tcq\" (UniqueName: \"kubernetes.io/projected/4b233541-55a0-4306-8539-4ea2839d2df2-kube-api-access-c6tcq\") pod \"certified-operators-f74bk\" (UID: \"4b233541-55a0-4306-8539-4ea2839d2df2\") " pod="openshift-marketplace/certified-operators-f74bk"
Dec 01 11:06:41 crc kubenswrapper[4909]: I1201 11:06:41.633520 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b233541-55a0-4306-8539-4ea2839d2df2-utilities\") pod \"certified-operators-f74bk\" (UID: \"4b233541-55a0-4306-8539-4ea2839d2df2\") " pod="openshift-marketplace/certified-operators-f74bk"
Dec 01 11:06:41 crc kubenswrapper[4909]: I1201 11:06:41.633991 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6tcq\" (UniqueName: \"kubernetes.io/projected/4b233541-55a0-4306-8539-4ea2839d2df2-kube-api-access-c6tcq\") pod \"certified-operators-f74bk\" (UID: \"4b233541-55a0-4306-8539-4ea2839d2df2\") " pod="openshift-marketplace/certified-operators-f74bk"
Dec 01 11:06:41 crc kubenswrapper[4909]: I1201 11:06:41.634173 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b233541-55a0-4306-8539-4ea2839d2df2-catalog-content\") pod \"certified-operators-f74bk\" (UID: \"4b233541-55a0-4306-8539-4ea2839d2df2\") " pod="openshift-marketplace/certified-operators-f74bk"
Dec 01 11:06:41 crc kubenswrapper[4909]: I1201 11:06:41.634685 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b233541-55a0-4306-8539-4ea2839d2df2-catalog-content\") pod \"certified-operators-f74bk\" (UID: \"4b233541-55a0-4306-8539-4ea2839d2df2\") " pod="openshift-marketplace/certified-operators-f74bk"
Dec 01 11:06:41 crc kubenswrapper[4909]: I1201 11:06:41.635342 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b233541-55a0-4306-8539-4ea2839d2df2-utilities\") pod \"certified-operators-f74bk\" (UID: \"4b233541-55a0-4306-8539-4ea2839d2df2\") " pod="openshift-marketplace/certified-operators-f74bk"
Dec 01 11:06:41 crc kubenswrapper[4909]: I1201 11:06:41.655535 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6tcq\" (UniqueName: \"kubernetes.io/projected/4b233541-55a0-4306-8539-4ea2839d2df2-kube-api-access-c6tcq\") pod \"certified-operators-f74bk\" (UID: \"4b233541-55a0-4306-8539-4ea2839d2df2\") " pod="openshift-marketplace/certified-operators-f74bk"
Dec 01 11:06:41 crc kubenswrapper[4909]: I1201 11:06:41.767216 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f74bk"
Dec 01 11:06:41 crc kubenswrapper[4909]: I1201 11:06:41.972268 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9" event={"ID":"df51a24f-6b29-49b3-bdee-153cb29154fe","Type":"ContainerStarted","Data":"e353a3afe0df4ff2c9028f6bfdbce3ce1ca702a34c19433db336baea13b34ac8"}
Dec 01 11:06:42 crc kubenswrapper[4909]: I1201 11:06:42.007731 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9" podStartSLOduration=2.5338527969999998 podStartE2EDuration="3.007713806s" podCreationTimestamp="2025-12-01 11:06:39 +0000 UTC" firstStartedPulling="2025-12-01 11:06:40.803147489 +0000 UTC m=+2118.037618387" lastFinishedPulling="2025-12-01 11:06:41.277008498 +0000 UTC m=+2118.511479396" observedRunningTime="2025-12-01 11:06:42.000716334 +0000 UTC m=+2119.235187252" watchObservedRunningTime="2025-12-01 11:06:42.007713806 +0000 UTC m=+2119.242184704"
Dec 01 11:06:42 crc kubenswrapper[4909]: I1201 11:06:42.363963 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f74bk"]
Dec 01 11:06:42 crc kubenswrapper[4909]: I1201 11:06:42.531276 4909 scope.go:117] "RemoveContainer" containerID="bc059a122b879bbbdfbacd723b14a28978f5ed4be95062204943e238a5282049"
Dec 01 11:06:42 crc kubenswrapper[4909]: I1201 11:06:42.868251 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8k2pv"
Dec 01 11:06:42 crc kubenswrapper[4909]: I1201 11:06:42.868608 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8k2pv"
Dec 01 11:06:42 crc kubenswrapper[4909]: I1201 11:06:42.914047 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8k2pv"
Dec 01 11:06:42 crc kubenswrapper[4909]: I1201 11:06:42.982422 4909 generic.go:334] "Generic (PLEG): container finished" podID="4b233541-55a0-4306-8539-4ea2839d2df2" containerID="9ce09fd26f03c81bdf61187cfa00f7b8914d79638905ff0f44d3342f08eb1ddf" exitCode=0
Dec 01 11:06:42 crc kubenswrapper[4909]: I1201 11:06:42.982528 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f74bk" event={"ID":"4b233541-55a0-4306-8539-4ea2839d2df2","Type":"ContainerDied","Data":"9ce09fd26f03c81bdf61187cfa00f7b8914d79638905ff0f44d3342f08eb1ddf"}
Dec 01 11:06:42 crc kubenswrapper[4909]: I1201 11:06:42.982615 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f74bk" event={"ID":"4b233541-55a0-4306-8539-4ea2839d2df2","Type":"ContainerStarted","Data":"92542ffc2c16e006f2fba8289dea375cdc2a4956a411ade460c62186439414a4"}
Dec 01 11:06:43 crc kubenswrapper[4909]: I1201 11:06:43.042854 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8k2pv"
Dec 01 11:06:43 crc kubenswrapper[4909]: I1201 11:06:43.636420 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6mmvw"]
Dec 01 11:06:43 crc kubenswrapper[4909]: I1201 11:06:43.638735 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6mmvw"
Dec 01 11:06:43 crc kubenswrapper[4909]: I1201 11:06:43.646370 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6mmvw"]
Dec 01 11:06:43 crc kubenswrapper[4909]: I1201 11:06:43.779431 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgq4n\" (UniqueName: \"kubernetes.io/projected/c359361c-f3c8-4383-9738-f3858ef23e33-kube-api-access-jgq4n\") pod \"community-operators-6mmvw\" (UID: \"c359361c-f3c8-4383-9738-f3858ef23e33\") " pod="openshift-marketplace/community-operators-6mmvw"
Dec 01 11:06:43 crc kubenswrapper[4909]: I1201 11:06:43.780293 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c359361c-f3c8-4383-9738-f3858ef23e33-utilities\") pod \"community-operators-6mmvw\" (UID: \"c359361c-f3c8-4383-9738-f3858ef23e33\") " pod="openshift-marketplace/community-operators-6mmvw"
Dec 01 11:06:43 crc kubenswrapper[4909]: I1201 11:06:43.780350 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c359361c-f3c8-4383-9738-f3858ef23e33-catalog-content\") pod \"community-operators-6mmvw\" (UID: \"c359361c-f3c8-4383-9738-f3858ef23e33\") " pod="openshift-marketplace/community-operators-6mmvw"
Dec 01 11:06:43 crc kubenswrapper[4909]: I1201 11:06:43.882528 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c359361c-f3c8-4383-9738-f3858ef23e33-utilities\") pod \"community-operators-6mmvw\" (UID: \"c359361c-f3c8-4383-9738-f3858ef23e33\") " pod="openshift-marketplace/community-operators-6mmvw"
Dec 01 11:06:43 crc kubenswrapper[4909]: I1201 11:06:43.882616 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c359361c-f3c8-4383-9738-f3858ef23e33-catalog-content\") pod \"community-operators-6mmvw\" (UID: \"c359361c-f3c8-4383-9738-f3858ef23e33\") " pod="openshift-marketplace/community-operators-6mmvw"
Dec 01 11:06:43 crc kubenswrapper[4909]: I1201 11:06:43.882797 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgq4n\" (UniqueName: \"kubernetes.io/projected/c359361c-f3c8-4383-9738-f3858ef23e33-kube-api-access-jgq4n\") pod \"community-operators-6mmvw\" (UID: \"c359361c-f3c8-4383-9738-f3858ef23e33\") " pod="openshift-marketplace/community-operators-6mmvw"
Dec 01 11:06:43 crc kubenswrapper[4909]: I1201 11:06:43.883418 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c359361c-f3c8-4383-9738-f3858ef23e33-catalog-content\") pod \"community-operators-6mmvw\" (UID: \"c359361c-f3c8-4383-9738-f3858ef23e33\") " pod="openshift-marketplace/community-operators-6mmvw"
Dec 01 11:06:43 crc kubenswrapper[4909]: I1201 11:06:43.883565 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c359361c-f3c8-4383-9738-f3858ef23e33-utilities\") pod \"community-operators-6mmvw\" (UID: \"c359361c-f3c8-4383-9738-f3858ef23e33\") " pod="openshift-marketplace/community-operators-6mmvw"
Dec 01 11:06:43 crc kubenswrapper[4909]: I1201 11:06:43.906504 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgq4n\" (UniqueName: \"kubernetes.io/projected/c359361c-f3c8-4383-9738-f3858ef23e33-kube-api-access-jgq4n\") pod \"community-operators-6mmvw\" (UID: \"c359361c-f3c8-4383-9738-f3858ef23e33\") " pod="openshift-marketplace/community-operators-6mmvw"
Dec 01 11:06:43 crc kubenswrapper[4909]: I1201 11:06:43.956378 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6mmvw"
Dec 01 11:06:44 crc kubenswrapper[4909]: I1201 11:06:44.011142 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f74bk" event={"ID":"4b233541-55a0-4306-8539-4ea2839d2df2","Type":"ContainerStarted","Data":"3cdcfa7c46e329aef5ba74a98080de27dc7b4e39b4cc52f6f9745ca72bda8b27"}
Dec 01 11:06:44 crc kubenswrapper[4909]: I1201 11:06:44.312170 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6mmvw"]
Dec 01 11:06:44 crc kubenswrapper[4909]: W1201 11:06:44.334578 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc359361c_f3c8_4383_9738_f3858ef23e33.slice/crio-67900faa1ef0559978584007756ddcb44ee5794e574d1cfc06b996dcbaf23365 WatchSource:0}: Error finding container 67900faa1ef0559978584007756ddcb44ee5794e574d1cfc06b996dcbaf23365: Status 404 returned error can't find the container with id 67900faa1ef0559978584007756ddcb44ee5794e574d1cfc06b996dcbaf23365
Dec 01 11:06:44 crc kubenswrapper[4909]: E1201 11:06:44.420727 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b233541_55a0_4306_8539_4ea2839d2df2.slice/crio-3cdcfa7c46e329aef5ba74a98080de27dc7b4e39b4cc52f6f9745ca72bda8b27.scope\": RecentStats: unable to find data in memory cache]"
Dec 01 11:06:45 crc kubenswrapper[4909]: I1201 11:06:45.020490 4909 generic.go:334] "Generic (PLEG): container finished" podID="c359361c-f3c8-4383-9738-f3858ef23e33" containerID="c4362b7ba81dcc46076ac621841723d8b8506a1e64b16527c169d98e3941c9e3" exitCode=0
Dec 01 11:06:45 crc kubenswrapper[4909]: I1201 11:06:45.020575 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mmvw" event={"ID":"c359361c-f3c8-4383-9738-f3858ef23e33","Type":"ContainerDied","Data":"c4362b7ba81dcc46076ac621841723d8b8506a1e64b16527c169d98e3941c9e3"}
Dec 01 11:06:45 crc kubenswrapper[4909]: I1201 11:06:45.020624 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mmvw" event={"ID":"c359361c-f3c8-4383-9738-f3858ef23e33","Type":"ContainerStarted","Data":"67900faa1ef0559978584007756ddcb44ee5794e574d1cfc06b996dcbaf23365"}
Dec 01 11:06:45 crc kubenswrapper[4909]: I1201 11:06:45.023726 4909 generic.go:334] "Generic (PLEG): container finished" podID="4b233541-55a0-4306-8539-4ea2839d2df2" containerID="3cdcfa7c46e329aef5ba74a98080de27dc7b4e39b4cc52f6f9745ca72bda8b27" exitCode=0
Dec 01 11:06:45 crc kubenswrapper[4909]: I1201 11:06:45.023767 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f74bk" event={"ID":"4b233541-55a0-4306-8539-4ea2839d2df2","Type":"ContainerDied","Data":"3cdcfa7c46e329aef5ba74a98080de27dc7b4e39b4cc52f6f9745ca72bda8b27"}
Dec 01 11:06:46 crc kubenswrapper[4909]: I1201 11:06:46.034435 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f74bk" event={"ID":"4b233541-55a0-4306-8539-4ea2839d2df2","Type":"ContainerStarted","Data":"421fdc57d3d9621f1afc079d137c03dd54cf23c617cd2c37b7865c05ac8eb473"}
Dec 01 11:06:46 crc kubenswrapper[4909]: I1201 11:06:46.061018 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f74bk" podStartSLOduration=2.446866678 podStartE2EDuration="5.06100255s" podCreationTimestamp="2025-12-01 11:06:41 +0000 UTC" firstStartedPulling="2025-12-01 11:06:42.98493059 +0000 UTC m=+2120.219401508" lastFinishedPulling="2025-12-01 11:06:45.599066482 +0000 UTC m=+2122.833537380" observedRunningTime="2025-12-01 11:06:46.056787153 +0000 UTC m=+2123.291258071" watchObservedRunningTime="2025-12-01 11:06:46.06100255 +0000 UTC m=+2123.295473448"
Dec 01 11:06:46 crc kubenswrapper[4909]: I1201 11:06:46.831024 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8k2pv"]
Dec 01 11:06:46 crc kubenswrapper[4909]: I1201 11:06:46.831295 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8k2pv" podUID="98802c8d-3009-4095-9e8f-ac3ecbbfda12" containerName="registry-server" containerID="cri-o://f3b2977369228d2fdc17a335c391c1abe5a57e2942fc32e1d6e21e2e59de019c" gracePeriod=2
Dec 01 11:06:47 crc kubenswrapper[4909]: I1201 11:06:47.101562 4909 generic.go:334] "Generic (PLEG): container finished" podID="98802c8d-3009-4095-9e8f-ac3ecbbfda12" containerID="f3b2977369228d2fdc17a335c391c1abe5a57e2942fc32e1d6e21e2e59de019c" exitCode=0
Dec 01 11:06:47 crc kubenswrapper[4909]: I1201 11:06:47.101812 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8k2pv" event={"ID":"98802c8d-3009-4095-9e8f-ac3ecbbfda12","Type":"ContainerDied","Data":"f3b2977369228d2fdc17a335c391c1abe5a57e2942fc32e1d6e21e2e59de019c"}
Dec 01 11:06:47 crc kubenswrapper[4909]: I1201 11:06:47.350933 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8k2pv"
Dec 01 11:06:47 crc kubenswrapper[4909]: I1201 11:06:47.466757 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98802c8d-3009-4095-9e8f-ac3ecbbfda12-utilities\") pod \"98802c8d-3009-4095-9e8f-ac3ecbbfda12\" (UID: \"98802c8d-3009-4095-9e8f-ac3ecbbfda12\") "
Dec 01 11:06:47 crc kubenswrapper[4909]: I1201 11:06:47.467108 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98802c8d-3009-4095-9e8f-ac3ecbbfda12-catalog-content\") pod \"98802c8d-3009-4095-9e8f-ac3ecbbfda12\" (UID: \"98802c8d-3009-4095-9e8f-ac3ecbbfda12\") "
Dec 01 11:06:47 crc kubenswrapper[4909]: I1201 11:06:47.467408 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfqcv\" (UniqueName: \"kubernetes.io/projected/98802c8d-3009-4095-9e8f-ac3ecbbfda12-kube-api-access-bfqcv\") pod \"98802c8d-3009-4095-9e8f-ac3ecbbfda12\" (UID: \"98802c8d-3009-4095-9e8f-ac3ecbbfda12\") "
Dec 01 11:06:47 crc kubenswrapper[4909]: I1201 11:06:47.468007 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98802c8d-3009-4095-9e8f-ac3ecbbfda12-utilities" (OuterVolumeSpecName: "utilities") pod "98802c8d-3009-4095-9e8f-ac3ecbbfda12" (UID: "98802c8d-3009-4095-9e8f-ac3ecbbfda12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:06:47 crc kubenswrapper[4909]: I1201 11:06:47.468941 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98802c8d-3009-4095-9e8f-ac3ecbbfda12-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 11:06:47 crc kubenswrapper[4909]: I1201 11:06:47.475776 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98802c8d-3009-4095-9e8f-ac3ecbbfda12-kube-api-access-bfqcv" (OuterVolumeSpecName: "kube-api-access-bfqcv") pod "98802c8d-3009-4095-9e8f-ac3ecbbfda12" (UID: "98802c8d-3009-4095-9e8f-ac3ecbbfda12"). InnerVolumeSpecName "kube-api-access-bfqcv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:06:47 crc kubenswrapper[4909]: I1201 11:06:47.483701 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98802c8d-3009-4095-9e8f-ac3ecbbfda12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98802c8d-3009-4095-9e8f-ac3ecbbfda12" (UID: "98802c8d-3009-4095-9e8f-ac3ecbbfda12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:06:47 crc kubenswrapper[4909]: I1201 11:06:47.570831 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98802c8d-3009-4095-9e8f-ac3ecbbfda12-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 11:06:47 crc kubenswrapper[4909]: I1201 11:06:47.570889 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfqcv\" (UniqueName: \"kubernetes.io/projected/98802c8d-3009-4095-9e8f-ac3ecbbfda12-kube-api-access-bfqcv\") on node \"crc\" DevicePath \"\""
Dec 01 11:06:48 crc kubenswrapper[4909]: I1201 11:06:48.114705 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8k2pv" event={"ID":"98802c8d-3009-4095-9e8f-ac3ecbbfda12","Type":"ContainerDied","Data":"a3c35153fd6456ec7a431f7487f57ccdb9460fa5903b73bb2a6bbc5897ac0c12"}
Dec 01 11:06:48 crc kubenswrapper[4909]: I1201 11:06:48.114764 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8k2pv"
Dec 01 11:06:48 crc kubenswrapper[4909]: I1201 11:06:48.114768 4909 scope.go:117] "RemoveContainer" containerID="f3b2977369228d2fdc17a335c391c1abe5a57e2942fc32e1d6e21e2e59de019c"
Dec 01 11:06:48 crc kubenswrapper[4909]: I1201 11:06:48.153411 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8k2pv"]
Dec 01 11:06:48 crc kubenswrapper[4909]: I1201 11:06:48.160537 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8k2pv"]
Dec 01 11:06:49 crc kubenswrapper[4909]: I1201 11:06:49.273362 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98802c8d-3009-4095-9e8f-ac3ecbbfda12" path="/var/lib/kubelet/pods/98802c8d-3009-4095-9e8f-ac3ecbbfda12/volumes"
Dec 01 11:06:49 crc kubenswrapper[4909]: I1201 11:06:49.317989 4909 scope.go:117] "RemoveContainer" containerID="7d335dbdea25089abddd3d12b49ccb92b73c1cee6674841bf26af316db768cb7"
Dec 01 11:06:49 crc kubenswrapper[4909]: I1201 11:06:49.376598 4909 scope.go:117] "RemoveContainer" containerID="37aa7b151dafe8994c7138e2fa2ebde64eaa0d3aca4a599395d528d1ca1c89d1"
Dec 01 11:06:50 crc kubenswrapper[4909]: I1201 11:06:50.134545 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mmvw" event={"ID":"c359361c-f3c8-4383-9738-f3858ef23e33","Type":"ContainerStarted","Data":"3bdc13056dbb1250867899e7b571b49523c59ac41a1088d11de436f7191fd5d7"}
Dec 01 11:06:51 crc kubenswrapper[4909]: I1201 11:06:51.148238 4909 generic.go:334] "Generic (PLEG): container finished" podID="c359361c-f3c8-4383-9738-f3858ef23e33" containerID="3bdc13056dbb1250867899e7b571b49523c59ac41a1088d11de436f7191fd5d7" exitCode=0
Dec 01 11:06:51 crc kubenswrapper[4909]: I1201 11:06:51.148342 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mmvw" event={"ID":"c359361c-f3c8-4383-9738-f3858ef23e33","Type":"ContainerDied","Data":"3bdc13056dbb1250867899e7b571b49523c59ac41a1088d11de436f7191fd5d7"}
Dec 01 11:06:51 crc kubenswrapper[4909]: I1201 11:06:51.767435 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f74bk"
Dec 01 11:06:51 crc kubenswrapper[4909]: I1201 11:06:51.768945 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f74bk"
Dec 01 11:06:51 crc kubenswrapper[4909]: I1201 11:06:51.826577 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f74bk"
Dec 01 11:06:52 crc kubenswrapper[4909]: I1201 11:06:52.162535 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mmvw" event={"ID":"c359361c-f3c8-4383-9738-f3858ef23e33","Type":"ContainerStarted","Data":"b66bf8f43fa2df4bff5827a40c79e8a7c85234d499e4482e2155800079125184"}
Dec 01 11:06:52 crc kubenswrapper[4909]: I1201 11:06:52.186496 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6mmvw" podStartSLOduration=2.568883586 podStartE2EDuration="9.186472512s" podCreationTimestamp="2025-12-01 11:06:43 +0000 UTC" firstStartedPulling="2025-12-01 11:06:45.022612862 +0000 UTC m=+2122.257083760" lastFinishedPulling="2025-12-01 11:06:51.640201788 +0000 UTC m=+2128.874672686" observedRunningTime="2025-12-01 11:06:52.181904274 +0000 UTC m=+2129.416375182" watchObservedRunningTime="2025-12-01 11:06:52.186472512 +0000 UTC m=+2129.420943410"
Dec 01 11:06:52 crc kubenswrapper[4909]: I1201 11:06:52.211732 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f74bk"
Dec 01 11:06:53 crc kubenswrapper[4909]: I1201 11:06:53.956487 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6mmvw"
Dec 01 11:06:53 crc kubenswrapper[4909]: I1201 11:06:53.956831 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6mmvw"
Dec 01 11:06:54 crc kubenswrapper[4909]: I1201 11:06:54.006865 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6mmvw"
Dec 01 11:06:54 crc kubenswrapper[4909]: I1201 11:06:54.426397 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f74bk"]
Dec 01 11:06:55 crc kubenswrapper[4909]: I1201 11:06:55.187204 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f74bk" podUID="4b233541-55a0-4306-8539-4ea2839d2df2" containerName="registry-server" containerID="cri-o://421fdc57d3d9621f1afc079d137c03dd54cf23c617cd2c37b7865c05ac8eb473" gracePeriod=2
Dec 01 11:06:55 crc kubenswrapper[4909]: I1201 11:06:55.755024 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f74bk"
Dec 01 11:06:55 crc kubenswrapper[4909]: I1201 11:06:55.886389 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b233541-55a0-4306-8539-4ea2839d2df2-catalog-content\") pod \"4b233541-55a0-4306-8539-4ea2839d2df2\" (UID: \"4b233541-55a0-4306-8539-4ea2839d2df2\") "
Dec 01 11:06:55 crc kubenswrapper[4909]: I1201 11:06:55.886507 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b233541-55a0-4306-8539-4ea2839d2df2-utilities\") pod \"4b233541-55a0-4306-8539-4ea2839d2df2\" (UID: \"4b233541-55a0-4306-8539-4ea2839d2df2\") "
Dec 01 11:06:55 crc kubenswrapper[4909]: I1201 11:06:55.886745 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6tcq\" (UniqueName: \"kubernetes.io/projected/4b233541-55a0-4306-8539-4ea2839d2df2-kube-api-access-c6tcq\") pod \"4b233541-55a0-4306-8539-4ea2839d2df2\" (UID: \"4b233541-55a0-4306-8539-4ea2839d2df2\") "
Dec 01 11:06:55 crc kubenswrapper[4909]: I1201 11:06:55.887713 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b233541-55a0-4306-8539-4ea2839d2df2-utilities" (OuterVolumeSpecName: "utilities") pod "4b233541-55a0-4306-8539-4ea2839d2df2" (UID: "4b233541-55a0-4306-8539-4ea2839d2df2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:06:55 crc kubenswrapper[4909]: I1201 11:06:55.897355 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b233541-55a0-4306-8539-4ea2839d2df2-kube-api-access-c6tcq" (OuterVolumeSpecName: "kube-api-access-c6tcq") pod "4b233541-55a0-4306-8539-4ea2839d2df2" (UID: "4b233541-55a0-4306-8539-4ea2839d2df2"). InnerVolumeSpecName "kube-api-access-c6tcq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:06:55 crc kubenswrapper[4909]: I1201 11:06:55.934001 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b233541-55a0-4306-8539-4ea2839d2df2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b233541-55a0-4306-8539-4ea2839d2df2" (UID: "4b233541-55a0-4306-8539-4ea2839d2df2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:06:55 crc kubenswrapper[4909]: I1201 11:06:55.988903 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6tcq\" (UniqueName: \"kubernetes.io/projected/4b233541-55a0-4306-8539-4ea2839d2df2-kube-api-access-c6tcq\") on node \"crc\" DevicePath \"\""
Dec 01 11:06:55 crc kubenswrapper[4909]: I1201 11:06:55.988958 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b233541-55a0-4306-8539-4ea2839d2df2-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 11:06:55 crc kubenswrapper[4909]: I1201 11:06:55.988974 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b233541-55a0-4306-8539-4ea2839d2df2-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 11:06:56 crc kubenswrapper[4909]: I1201 11:06:56.197605 4909 generic.go:334] "Generic (PLEG): container finished" podID="4b233541-55a0-4306-8539-4ea2839d2df2" containerID="421fdc57d3d9621f1afc079d137c03dd54cf23c617cd2c37b7865c05ac8eb473" exitCode=0
Dec 01 11:06:56 crc kubenswrapper[4909]: I1201 11:06:56.197653 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f74bk" event={"ID":"4b233541-55a0-4306-8539-4ea2839d2df2","Type":"ContainerDied","Data":"421fdc57d3d9621f1afc079d137c03dd54cf23c617cd2c37b7865c05ac8eb473"}
Dec 01 11:06:56 crc kubenswrapper[4909]: I1201 11:06:56.197665 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f74bk"
Dec 01 11:06:56 crc kubenswrapper[4909]: I1201 11:06:56.197685 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f74bk" event={"ID":"4b233541-55a0-4306-8539-4ea2839d2df2","Type":"ContainerDied","Data":"92542ffc2c16e006f2fba8289dea375cdc2a4956a411ade460c62186439414a4"}
Dec 01 11:06:56 crc kubenswrapper[4909]: I1201 11:06:56.197706 4909 scope.go:117] "RemoveContainer" containerID="421fdc57d3d9621f1afc079d137c03dd54cf23c617cd2c37b7865c05ac8eb473"
Dec 01 11:06:56 crc kubenswrapper[4909]: I1201 11:06:56.218248 4909 scope.go:117] "RemoveContainer" containerID="3cdcfa7c46e329aef5ba74a98080de27dc7b4e39b4cc52f6f9745ca72bda8b27"
Dec 01 11:06:56 crc kubenswrapper[4909]: I1201 11:06:56.240024 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f74bk"]
Dec 01 11:06:56 crc kubenswrapper[4909]: I1201 11:06:56.242742 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f74bk"]
Dec 01 11:06:56 crc kubenswrapper[4909]: I1201 11:06:56.258622 4909 scope.go:117] "RemoveContainer" containerID="9ce09fd26f03c81bdf61187cfa00f7b8914d79638905ff0f44d3342f08eb1ddf"
Dec 01 11:06:56 crc kubenswrapper[4909]: I1201 11:06:56.291402 4909 scope.go:117] "RemoveContainer" containerID="421fdc57d3d9621f1afc079d137c03dd54cf23c617cd2c37b7865c05ac8eb473"
Dec 01 11:06:56 crc kubenswrapper[4909]: E1201 11:06:56.292002 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"421fdc57d3d9621f1afc079d137c03dd54cf23c617cd2c37b7865c05ac8eb473\": container with ID starting with 421fdc57d3d9621f1afc079d137c03dd54cf23c617cd2c37b7865c05ac8eb473 not found: ID does not exist" containerID="421fdc57d3d9621f1afc079d137c03dd54cf23c617cd2c37b7865c05ac8eb473"
Dec 01 11:06:56 crc kubenswrapper[4909]: I1201 11:06:56.292048
4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"421fdc57d3d9621f1afc079d137c03dd54cf23c617cd2c37b7865c05ac8eb473"} err="failed to get container status \"421fdc57d3d9621f1afc079d137c03dd54cf23c617cd2c37b7865c05ac8eb473\": rpc error: code = NotFound desc = could not find container \"421fdc57d3d9621f1afc079d137c03dd54cf23c617cd2c37b7865c05ac8eb473\": container with ID starting with 421fdc57d3d9621f1afc079d137c03dd54cf23c617cd2c37b7865c05ac8eb473 not found: ID does not exist" Dec 01 11:06:56 crc kubenswrapper[4909]: I1201 11:06:56.292077 4909 scope.go:117] "RemoveContainer" containerID="3cdcfa7c46e329aef5ba74a98080de27dc7b4e39b4cc52f6f9745ca72bda8b27" Dec 01 11:06:56 crc kubenswrapper[4909]: E1201 11:06:56.292521 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cdcfa7c46e329aef5ba74a98080de27dc7b4e39b4cc52f6f9745ca72bda8b27\": container with ID starting with 3cdcfa7c46e329aef5ba74a98080de27dc7b4e39b4cc52f6f9745ca72bda8b27 not found: ID does not exist" containerID="3cdcfa7c46e329aef5ba74a98080de27dc7b4e39b4cc52f6f9745ca72bda8b27" Dec 01 11:06:56 crc kubenswrapper[4909]: I1201 11:06:56.292553 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cdcfa7c46e329aef5ba74a98080de27dc7b4e39b4cc52f6f9745ca72bda8b27"} err="failed to get container status \"3cdcfa7c46e329aef5ba74a98080de27dc7b4e39b4cc52f6f9745ca72bda8b27\": rpc error: code = NotFound desc = could not find container \"3cdcfa7c46e329aef5ba74a98080de27dc7b4e39b4cc52f6f9745ca72bda8b27\": container with ID starting with 3cdcfa7c46e329aef5ba74a98080de27dc7b4e39b4cc52f6f9745ca72bda8b27 not found: ID does not exist" Dec 01 11:06:56 crc kubenswrapper[4909]: I1201 11:06:56.292574 4909 scope.go:117] "RemoveContainer" containerID="9ce09fd26f03c81bdf61187cfa00f7b8914d79638905ff0f44d3342f08eb1ddf" Dec 01 11:06:56 crc kubenswrapper[4909]: E1201 
11:06:56.292895 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ce09fd26f03c81bdf61187cfa00f7b8914d79638905ff0f44d3342f08eb1ddf\": container with ID starting with 9ce09fd26f03c81bdf61187cfa00f7b8914d79638905ff0f44d3342f08eb1ddf not found: ID does not exist" containerID="9ce09fd26f03c81bdf61187cfa00f7b8914d79638905ff0f44d3342f08eb1ddf" Dec 01 11:06:56 crc kubenswrapper[4909]: I1201 11:06:56.292934 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ce09fd26f03c81bdf61187cfa00f7b8914d79638905ff0f44d3342f08eb1ddf"} err="failed to get container status \"9ce09fd26f03c81bdf61187cfa00f7b8914d79638905ff0f44d3342f08eb1ddf\": rpc error: code = NotFound desc = could not find container \"9ce09fd26f03c81bdf61187cfa00f7b8914d79638905ff0f44d3342f08eb1ddf\": container with ID starting with 9ce09fd26f03c81bdf61187cfa00f7b8914d79638905ff0f44d3342f08eb1ddf not found: ID does not exist" Dec 01 11:06:57 crc kubenswrapper[4909]: I1201 11:06:57.266712 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b233541-55a0-4306-8539-4ea2839d2df2" path="/var/lib/kubelet/pods/4b233541-55a0-4306-8539-4ea2839d2df2/volumes" Dec 01 11:07:04 crc kubenswrapper[4909]: I1201 11:07:04.015533 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6mmvw" Dec 01 11:07:05 crc kubenswrapper[4909]: I1201 11:07:05.888981 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6mmvw"] Dec 01 11:07:06 crc kubenswrapper[4909]: I1201 11:07:06.193292 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:07:06 crc 
kubenswrapper[4909]: I1201 11:07:06.193398 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:07:06 crc kubenswrapper[4909]: I1201 11:07:06.263844 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-knj4t"] Dec 01 11:07:06 crc kubenswrapper[4909]: I1201 11:07:06.264133 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-knj4t" podUID="f10aa799-1736-49e3-a39d-d0a61dfbf0c4" containerName="registry-server" containerID="cri-o://93bf90c8fc71f5eb567eb02a78022245e10242c2e696e6dd5c530386326e523e" gracePeriod=2 Dec 01 11:07:06 crc kubenswrapper[4909]: I1201 11:07:06.883473 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-knj4t" Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.010326 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10aa799-1736-49e3-a39d-d0a61dfbf0c4-catalog-content\") pod \"f10aa799-1736-49e3-a39d-d0a61dfbf0c4\" (UID: \"f10aa799-1736-49e3-a39d-d0a61dfbf0c4\") " Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.010705 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10aa799-1736-49e3-a39d-d0a61dfbf0c4-utilities\") pod \"f10aa799-1736-49e3-a39d-d0a61dfbf0c4\" (UID: \"f10aa799-1736-49e3-a39d-d0a61dfbf0c4\") " Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.011016 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgqhf\" (UniqueName: \"kubernetes.io/projected/f10aa799-1736-49e3-a39d-d0a61dfbf0c4-kube-api-access-sgqhf\") pod \"f10aa799-1736-49e3-a39d-d0a61dfbf0c4\" (UID: \"f10aa799-1736-49e3-a39d-d0a61dfbf0c4\") " Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.012019 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f10aa799-1736-49e3-a39d-d0a61dfbf0c4-utilities" (OuterVolumeSpecName: "utilities") pod "f10aa799-1736-49e3-a39d-d0a61dfbf0c4" (UID: "f10aa799-1736-49e3-a39d-d0a61dfbf0c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.021175 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f10aa799-1736-49e3-a39d-d0a61dfbf0c4-kube-api-access-sgqhf" (OuterVolumeSpecName: "kube-api-access-sgqhf") pod "f10aa799-1736-49e3-a39d-d0a61dfbf0c4" (UID: "f10aa799-1736-49e3-a39d-d0a61dfbf0c4"). InnerVolumeSpecName "kube-api-access-sgqhf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.061200 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f10aa799-1736-49e3-a39d-d0a61dfbf0c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f10aa799-1736-49e3-a39d-d0a61dfbf0c4" (UID: "f10aa799-1736-49e3-a39d-d0a61dfbf0c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.112719 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgqhf\" (UniqueName: \"kubernetes.io/projected/f10aa799-1736-49e3-a39d-d0a61dfbf0c4-kube-api-access-sgqhf\") on node \"crc\" DevicePath \"\"" Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.112757 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10aa799-1736-49e3-a39d-d0a61dfbf0c4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.112767 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10aa799-1736-49e3-a39d-d0a61dfbf0c4-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.297215 4909 generic.go:334] "Generic (PLEG): container finished" podID="f10aa799-1736-49e3-a39d-d0a61dfbf0c4" containerID="93bf90c8fc71f5eb567eb02a78022245e10242c2e696e6dd5c530386326e523e" exitCode=0 Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.297430 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knj4t" event={"ID":"f10aa799-1736-49e3-a39d-d0a61dfbf0c4","Type":"ContainerDied","Data":"93bf90c8fc71f5eb567eb02a78022245e10242c2e696e6dd5c530386326e523e"} Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.297696 4909 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-knj4t" event={"ID":"f10aa799-1736-49e3-a39d-d0a61dfbf0c4","Type":"ContainerDied","Data":"e4981b0c628ca871c2336698465a81b1062640c2142f1f7862d8cac9d9a258d1"} Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.297566 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-knj4t" Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.297758 4909 scope.go:117] "RemoveContainer" containerID="93bf90c8fc71f5eb567eb02a78022245e10242c2e696e6dd5c530386326e523e" Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.323635 4909 scope.go:117] "RemoveContainer" containerID="ddbca77349cbf68e35772b7bb6811796bc595e3566f458d67bd175e9ce0ae3f8" Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.334513 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-knj4t"] Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.346178 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-knj4t"] Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.357701 4909 scope.go:117] "RemoveContainer" containerID="abe618aaf94bc7921b48220d2bdc98815bb3e9e874eed1eafa18ad212add54a3" Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.402445 4909 scope.go:117] "RemoveContainer" containerID="93bf90c8fc71f5eb567eb02a78022245e10242c2e696e6dd5c530386326e523e" Dec 01 11:07:07 crc kubenswrapper[4909]: E1201 11:07:07.403419 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93bf90c8fc71f5eb567eb02a78022245e10242c2e696e6dd5c530386326e523e\": container with ID starting with 93bf90c8fc71f5eb567eb02a78022245e10242c2e696e6dd5c530386326e523e not found: ID does not exist" containerID="93bf90c8fc71f5eb567eb02a78022245e10242c2e696e6dd5c530386326e523e" Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 
11:07:07.403456 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93bf90c8fc71f5eb567eb02a78022245e10242c2e696e6dd5c530386326e523e"} err="failed to get container status \"93bf90c8fc71f5eb567eb02a78022245e10242c2e696e6dd5c530386326e523e\": rpc error: code = NotFound desc = could not find container \"93bf90c8fc71f5eb567eb02a78022245e10242c2e696e6dd5c530386326e523e\": container with ID starting with 93bf90c8fc71f5eb567eb02a78022245e10242c2e696e6dd5c530386326e523e not found: ID does not exist" Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.403484 4909 scope.go:117] "RemoveContainer" containerID="ddbca77349cbf68e35772b7bb6811796bc595e3566f458d67bd175e9ce0ae3f8" Dec 01 11:07:07 crc kubenswrapper[4909]: E1201 11:07:07.404898 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddbca77349cbf68e35772b7bb6811796bc595e3566f458d67bd175e9ce0ae3f8\": container with ID starting with ddbca77349cbf68e35772b7bb6811796bc595e3566f458d67bd175e9ce0ae3f8 not found: ID does not exist" containerID="ddbca77349cbf68e35772b7bb6811796bc595e3566f458d67bd175e9ce0ae3f8" Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.404920 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddbca77349cbf68e35772b7bb6811796bc595e3566f458d67bd175e9ce0ae3f8"} err="failed to get container status \"ddbca77349cbf68e35772b7bb6811796bc595e3566f458d67bd175e9ce0ae3f8\": rpc error: code = NotFound desc = could not find container \"ddbca77349cbf68e35772b7bb6811796bc595e3566f458d67bd175e9ce0ae3f8\": container with ID starting with ddbca77349cbf68e35772b7bb6811796bc595e3566f458d67bd175e9ce0ae3f8 not found: ID does not exist" Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.404936 4909 scope.go:117] "RemoveContainer" containerID="abe618aaf94bc7921b48220d2bdc98815bb3e9e874eed1eafa18ad212add54a3" Dec 01 11:07:07 crc 
kubenswrapper[4909]: E1201 11:07:07.405185 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abe618aaf94bc7921b48220d2bdc98815bb3e9e874eed1eafa18ad212add54a3\": container with ID starting with abe618aaf94bc7921b48220d2bdc98815bb3e9e874eed1eafa18ad212add54a3 not found: ID does not exist" containerID="abe618aaf94bc7921b48220d2bdc98815bb3e9e874eed1eafa18ad212add54a3" Dec 01 11:07:07 crc kubenswrapper[4909]: I1201 11:07:07.405208 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abe618aaf94bc7921b48220d2bdc98815bb3e9e874eed1eafa18ad212add54a3"} err="failed to get container status \"abe618aaf94bc7921b48220d2bdc98815bb3e9e874eed1eafa18ad212add54a3\": rpc error: code = NotFound desc = could not find container \"abe618aaf94bc7921b48220d2bdc98815bb3e9e874eed1eafa18ad212add54a3\": container with ID starting with abe618aaf94bc7921b48220d2bdc98815bb3e9e874eed1eafa18ad212add54a3 not found: ID does not exist" Dec 01 11:07:09 crc kubenswrapper[4909]: I1201 11:07:09.267978 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f10aa799-1736-49e3-a39d-d0a61dfbf0c4" path="/var/lib/kubelet/pods/f10aa799-1736-49e3-a39d-d0a61dfbf0c4/volumes" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.723753 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-smsql"] Dec 01 11:07:22 crc kubenswrapper[4909]: E1201 11:07:22.724850 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10aa799-1736-49e3-a39d-d0a61dfbf0c4" containerName="registry-server" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.724885 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10aa799-1736-49e3-a39d-d0a61dfbf0c4" containerName="registry-server" Dec 01 11:07:22 crc kubenswrapper[4909]: E1201 11:07:22.724904 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="98802c8d-3009-4095-9e8f-ac3ecbbfda12" containerName="extract-content" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.724912 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="98802c8d-3009-4095-9e8f-ac3ecbbfda12" containerName="extract-content" Dec 01 11:07:22 crc kubenswrapper[4909]: E1201 11:07:22.724923 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98802c8d-3009-4095-9e8f-ac3ecbbfda12" containerName="registry-server" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.724929 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="98802c8d-3009-4095-9e8f-ac3ecbbfda12" containerName="registry-server" Dec 01 11:07:22 crc kubenswrapper[4909]: E1201 11:07:22.724948 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10aa799-1736-49e3-a39d-d0a61dfbf0c4" containerName="extract-utilities" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.724955 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10aa799-1736-49e3-a39d-d0a61dfbf0c4" containerName="extract-utilities" Dec 01 11:07:22 crc kubenswrapper[4909]: E1201 11:07:22.724977 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98802c8d-3009-4095-9e8f-ac3ecbbfda12" containerName="extract-utilities" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.724985 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="98802c8d-3009-4095-9e8f-ac3ecbbfda12" containerName="extract-utilities" Dec 01 11:07:22 crc kubenswrapper[4909]: E1201 11:07:22.724996 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b233541-55a0-4306-8539-4ea2839d2df2" containerName="extract-content" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.725003 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b233541-55a0-4306-8539-4ea2839d2df2" containerName="extract-content" Dec 01 11:07:22 crc kubenswrapper[4909]: E1201 11:07:22.725013 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4b233541-55a0-4306-8539-4ea2839d2df2" containerName="registry-server" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.725020 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b233541-55a0-4306-8539-4ea2839d2df2" containerName="registry-server" Dec 01 11:07:22 crc kubenswrapper[4909]: E1201 11:07:22.725033 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10aa799-1736-49e3-a39d-d0a61dfbf0c4" containerName="extract-content" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.725039 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10aa799-1736-49e3-a39d-d0a61dfbf0c4" containerName="extract-content" Dec 01 11:07:22 crc kubenswrapper[4909]: E1201 11:07:22.725061 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b233541-55a0-4306-8539-4ea2839d2df2" containerName="extract-utilities" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.725069 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b233541-55a0-4306-8539-4ea2839d2df2" containerName="extract-utilities" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.725276 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="98802c8d-3009-4095-9e8f-ac3ecbbfda12" containerName="registry-server" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.725314 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b233541-55a0-4306-8539-4ea2839d2df2" containerName="registry-server" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.725323 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f10aa799-1736-49e3-a39d-d0a61dfbf0c4" containerName="registry-server" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.726813 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-smsql" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.739917 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smsql"] Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.828807 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb661c25-da6e-45dc-9c4a-7ceaf7a2c805-utilities\") pod \"redhat-operators-smsql\" (UID: \"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805\") " pod="openshift-marketplace/redhat-operators-smsql" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.828865 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-464wz\" (UniqueName: \"kubernetes.io/projected/bb661c25-da6e-45dc-9c4a-7ceaf7a2c805-kube-api-access-464wz\") pod \"redhat-operators-smsql\" (UID: \"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805\") " pod="openshift-marketplace/redhat-operators-smsql" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.829212 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb661c25-da6e-45dc-9c4a-7ceaf7a2c805-catalog-content\") pod \"redhat-operators-smsql\" (UID: \"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805\") " pod="openshift-marketplace/redhat-operators-smsql" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.931534 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb661c25-da6e-45dc-9c4a-7ceaf7a2c805-utilities\") pod \"redhat-operators-smsql\" (UID: \"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805\") " pod="openshift-marketplace/redhat-operators-smsql" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.931591 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-464wz\" (UniqueName: \"kubernetes.io/projected/bb661c25-da6e-45dc-9c4a-7ceaf7a2c805-kube-api-access-464wz\") pod \"redhat-operators-smsql\" (UID: \"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805\") " pod="openshift-marketplace/redhat-operators-smsql" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.931659 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb661c25-da6e-45dc-9c4a-7ceaf7a2c805-catalog-content\") pod \"redhat-operators-smsql\" (UID: \"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805\") " pod="openshift-marketplace/redhat-operators-smsql" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.932139 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb661c25-da6e-45dc-9c4a-7ceaf7a2c805-catalog-content\") pod \"redhat-operators-smsql\" (UID: \"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805\") " pod="openshift-marketplace/redhat-operators-smsql" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.932361 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb661c25-da6e-45dc-9c4a-7ceaf7a2c805-utilities\") pod \"redhat-operators-smsql\" (UID: \"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805\") " pod="openshift-marketplace/redhat-operators-smsql" Dec 01 11:07:22 crc kubenswrapper[4909]: I1201 11:07:22.965344 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-464wz\" (UniqueName: \"kubernetes.io/projected/bb661c25-da6e-45dc-9c4a-7ceaf7a2c805-kube-api-access-464wz\") pod \"redhat-operators-smsql\" (UID: \"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805\") " pod="openshift-marketplace/redhat-operators-smsql" Dec 01 11:07:23 crc kubenswrapper[4909]: I1201 11:07:23.054128 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-smsql" Dec 01 11:07:23 crc kubenswrapper[4909]: I1201 11:07:23.553480 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smsql"] Dec 01 11:07:24 crc kubenswrapper[4909]: I1201 11:07:24.458183 4909 generic.go:334] "Generic (PLEG): container finished" podID="bb661c25-da6e-45dc-9c4a-7ceaf7a2c805" containerID="7400fd1a4682caf1d986fee404de7b5839559a36a13ae20d9e0b33c4faad84e2" exitCode=0 Dec 01 11:07:24 crc kubenswrapper[4909]: I1201 11:07:24.458280 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smsql" event={"ID":"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805","Type":"ContainerDied","Data":"7400fd1a4682caf1d986fee404de7b5839559a36a13ae20d9e0b33c4faad84e2"} Dec 01 11:07:24 crc kubenswrapper[4909]: I1201 11:07:24.458457 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smsql" event={"ID":"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805","Type":"ContainerStarted","Data":"36dc272222458bce00b7941b8ae16c90fcfa9998f3046369772777e2ccc05c28"} Dec 01 11:07:25 crc kubenswrapper[4909]: I1201 11:07:25.477453 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smsql" event={"ID":"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805","Type":"ContainerStarted","Data":"0c24abea93b594d5baad32dcc4935c25f9c1fc0f6bc2c8188327090ad82719e2"} Dec 01 11:07:27 crc kubenswrapper[4909]: I1201 11:07:27.493546 4909 generic.go:334] "Generic (PLEG): container finished" podID="df51a24f-6b29-49b3-bdee-153cb29154fe" containerID="e353a3afe0df4ff2c9028f6bfdbce3ce1ca702a34c19433db336baea13b34ac8" exitCode=0 Dec 01 11:07:27 crc kubenswrapper[4909]: I1201 11:07:27.493631 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9" 
event={"ID":"df51a24f-6b29-49b3-bdee-153cb29154fe","Type":"ContainerDied","Data":"e353a3afe0df4ff2c9028f6bfdbce3ce1ca702a34c19433db336baea13b34ac8"} Dec 01 11:07:27 crc kubenswrapper[4909]: I1201 11:07:27.496170 4909 generic.go:334] "Generic (PLEG): container finished" podID="bb661c25-da6e-45dc-9c4a-7ceaf7a2c805" containerID="0c24abea93b594d5baad32dcc4935c25f9c1fc0f6bc2c8188327090ad82719e2" exitCode=0 Dec 01 11:07:27 crc kubenswrapper[4909]: I1201 11:07:27.496407 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smsql" event={"ID":"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805","Type":"ContainerDied","Data":"0c24abea93b594d5baad32dcc4935c25f9c1fc0f6bc2c8188327090ad82719e2"} Dec 01 11:07:28 crc kubenswrapper[4909]: I1201 11:07:28.942418 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.058419 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df51a24f-6b29-49b3-bdee-153cb29154fe-inventory\") pod \"df51a24f-6b29-49b3-bdee-153cb29154fe\" (UID: \"df51a24f-6b29-49b3-bdee-153cb29154fe\") " Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.058742 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk2ml\" (UniqueName: \"kubernetes.io/projected/df51a24f-6b29-49b3-bdee-153cb29154fe-kube-api-access-bk2ml\") pod \"df51a24f-6b29-49b3-bdee-153cb29154fe\" (UID: \"df51a24f-6b29-49b3-bdee-153cb29154fe\") " Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.058799 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/df51a24f-6b29-49b3-bdee-153cb29154fe-ssh-key\") pod \"df51a24f-6b29-49b3-bdee-153cb29154fe\" (UID: \"df51a24f-6b29-49b3-bdee-153cb29154fe\") " Dec 01 
11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.058896 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df51a24f-6b29-49b3-bdee-153cb29154fe-ceph\") pod \"df51a24f-6b29-49b3-bdee-153cb29154fe\" (UID: \"df51a24f-6b29-49b3-bdee-153cb29154fe\") " Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.065966 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df51a24f-6b29-49b3-bdee-153cb29154fe-kube-api-access-bk2ml" (OuterVolumeSpecName: "kube-api-access-bk2ml") pod "df51a24f-6b29-49b3-bdee-153cb29154fe" (UID: "df51a24f-6b29-49b3-bdee-153cb29154fe"). InnerVolumeSpecName "kube-api-access-bk2ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.067093 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df51a24f-6b29-49b3-bdee-153cb29154fe-ceph" (OuterVolumeSpecName: "ceph") pod "df51a24f-6b29-49b3-bdee-153cb29154fe" (UID: "df51a24f-6b29-49b3-bdee-153cb29154fe"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.091263 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df51a24f-6b29-49b3-bdee-153cb29154fe-inventory" (OuterVolumeSpecName: "inventory") pod "df51a24f-6b29-49b3-bdee-153cb29154fe" (UID: "df51a24f-6b29-49b3-bdee-153cb29154fe"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.091763 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df51a24f-6b29-49b3-bdee-153cb29154fe-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "df51a24f-6b29-49b3-bdee-153cb29154fe" (UID: "df51a24f-6b29-49b3-bdee-153cb29154fe"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.160831 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk2ml\" (UniqueName: \"kubernetes.io/projected/df51a24f-6b29-49b3-bdee-153cb29154fe-kube-api-access-bk2ml\") on node \"crc\" DevicePath \"\"" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.160866 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/df51a24f-6b29-49b3-bdee-153cb29154fe-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.160887 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df51a24f-6b29-49b3-bdee-153cb29154fe-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.160896 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df51a24f-6b29-49b3-bdee-153cb29154fe-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.514813 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9" event={"ID":"df51a24f-6b29-49b3-bdee-153cb29154fe","Type":"ContainerDied","Data":"a4d2ba5c93a7e5339986c68a297a3d404afec73b21b95256d05ca5ac9c45311e"} Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.515152 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4d2ba5c93a7e5339986c68a297a3d404afec73b21b95256d05ca5ac9c45311e" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.515011 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b24p9" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.517817 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smsql" event={"ID":"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805","Type":"ContainerStarted","Data":"8a28930ca946ba33fd54c6fb87837922b7e59023a67f465f2f63a57ece0176e7"} Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.550915 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-smsql" podStartSLOduration=3.259336144 podStartE2EDuration="7.550863453s" podCreationTimestamp="2025-12-01 11:07:22 +0000 UTC" firstStartedPulling="2025-12-01 11:07:24.459699666 +0000 UTC m=+2161.694170564" lastFinishedPulling="2025-12-01 11:07:28.751226975 +0000 UTC m=+2165.985697873" observedRunningTime="2025-12-01 11:07:29.538047885 +0000 UTC m=+2166.772518783" watchObservedRunningTime="2025-12-01 11:07:29.550863453 +0000 UTC m=+2166.785334361" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.605505 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-52zqm"] Dec 01 11:07:29 crc kubenswrapper[4909]: E1201 11:07:29.606061 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df51a24f-6b29-49b3-bdee-153cb29154fe" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.606086 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="df51a24f-6b29-49b3-bdee-153cb29154fe" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.606320 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="df51a24f-6b29-49b3-bdee-153cb29154fe" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.607144 4909 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-52zqm" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.609458 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.609607 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.609646 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.609788 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.609887 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.614259 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-52zqm"] Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.671510 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mn2z\" (UniqueName: \"kubernetes.io/projected/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-kube-api-access-2mn2z\") pod \"ssh-known-hosts-edpm-deployment-52zqm\" (UID: \"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-52zqm" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.671579 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-52zqm\" (UID: \"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-52zqm" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.671717 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-52zqm\" (UID: \"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-52zqm" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.671833 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-ceph\") pod \"ssh-known-hosts-edpm-deployment-52zqm\" (UID: \"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-52zqm" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.774527 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-ceph\") pod \"ssh-known-hosts-edpm-deployment-52zqm\" (UID: \"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-52zqm" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.774640 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mn2z\" (UniqueName: \"kubernetes.io/projected/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-kube-api-access-2mn2z\") pod \"ssh-known-hosts-edpm-deployment-52zqm\" (UID: \"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-52zqm" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.774669 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-inventory-0\") pod 
\"ssh-known-hosts-edpm-deployment-52zqm\" (UID: \"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-52zqm" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.774719 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-52zqm\" (UID: \"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-52zqm" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.779123 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-52zqm\" (UID: \"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-52zqm" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.781359 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-ceph\") pod \"ssh-known-hosts-edpm-deployment-52zqm\" (UID: \"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-52zqm" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.782659 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-52zqm\" (UID: \"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-52zqm" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.797518 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mn2z\" (UniqueName: 
\"kubernetes.io/projected/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-kube-api-access-2mn2z\") pod \"ssh-known-hosts-edpm-deployment-52zqm\" (UID: \"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-52zqm" Dec 01 11:07:29 crc kubenswrapper[4909]: I1201 11:07:29.924895 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-52zqm" Dec 01 11:07:30 crc kubenswrapper[4909]: I1201 11:07:30.456355 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-52zqm"] Dec 01 11:07:30 crc kubenswrapper[4909]: I1201 11:07:30.527832 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-52zqm" event={"ID":"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef","Type":"ContainerStarted","Data":"226fb5db1e77577afd194330af0e70e3daed10135c64e61b9830a9e8a18f14bd"} Dec 01 11:07:31 crc kubenswrapper[4909]: I1201 11:07:31.540152 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-52zqm" event={"ID":"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef","Type":"ContainerStarted","Data":"b7b9546caba4581311233306b45f2b0283a92ba1a5fdf9162751516867446dfd"} Dec 01 11:07:31 crc kubenswrapper[4909]: I1201 11:07:31.564547 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-52zqm" podStartSLOduration=2.058464751 podStartE2EDuration="2.564524136s" podCreationTimestamp="2025-12-01 11:07:29 +0000 UTC" firstStartedPulling="2025-12-01 11:07:30.463783917 +0000 UTC m=+2167.698254815" lastFinishedPulling="2025-12-01 11:07:30.969843302 +0000 UTC m=+2168.204314200" observedRunningTime="2025-12-01 11:07:31.561614648 +0000 UTC m=+2168.796085556" watchObservedRunningTime="2025-12-01 11:07:31.564524136 +0000 UTC m=+2168.798995034" Dec 01 11:07:33 crc kubenswrapper[4909]: I1201 11:07:33.054622 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-operators-smsql" Dec 01 11:07:33 crc kubenswrapper[4909]: I1201 11:07:33.055034 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-smsql" Dec 01 11:07:34 crc kubenswrapper[4909]: I1201 11:07:34.100156 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-smsql" podUID="bb661c25-da6e-45dc-9c4a-7ceaf7a2c805" containerName="registry-server" probeResult="failure" output=< Dec 01 11:07:34 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Dec 01 11:07:34 crc kubenswrapper[4909]: > Dec 01 11:07:36 crc kubenswrapper[4909]: I1201 11:07:36.193956 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:07:36 crc kubenswrapper[4909]: I1201 11:07:36.194455 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:07:40 crc kubenswrapper[4909]: I1201 11:07:40.643798 4909 generic.go:334] "Generic (PLEG): container finished" podID="4ce039e6-b0e9-43d5-bf88-f4c169bb03ef" containerID="b7b9546caba4581311233306b45f2b0283a92ba1a5fdf9162751516867446dfd" exitCode=0 Dec 01 11:07:40 crc kubenswrapper[4909]: I1201 11:07:40.643907 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-52zqm" 
event={"ID":"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef","Type":"ContainerDied","Data":"b7b9546caba4581311233306b45f2b0283a92ba1a5fdf9162751516867446dfd"} Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.097050 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-52zqm" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.147601 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mn2z\" (UniqueName: \"kubernetes.io/projected/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-kube-api-access-2mn2z\") pod \"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef\" (UID: \"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef\") " Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.147931 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-inventory-0\") pod \"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef\" (UID: \"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef\") " Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.148106 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-ceph\") pod \"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef\" (UID: \"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef\") " Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.148203 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-ssh-key-openstack-edpm-ipam\") pod \"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef\" (UID: \"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef\") " Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.154196 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-ceph" (OuterVolumeSpecName: 
"ceph") pod "4ce039e6-b0e9-43d5-bf88-f4c169bb03ef" (UID: "4ce039e6-b0e9-43d5-bf88-f4c169bb03ef"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.158425 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-kube-api-access-2mn2z" (OuterVolumeSpecName: "kube-api-access-2mn2z") pod "4ce039e6-b0e9-43d5-bf88-f4c169bb03ef" (UID: "4ce039e6-b0e9-43d5-bf88-f4c169bb03ef"). InnerVolumeSpecName "kube-api-access-2mn2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.174721 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "4ce039e6-b0e9-43d5-bf88-f4c169bb03ef" (UID: "4ce039e6-b0e9-43d5-bf88-f4c169bb03ef"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.175325 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4ce039e6-b0e9-43d5-bf88-f4c169bb03ef" (UID: "4ce039e6-b0e9-43d5-bf88-f4c169bb03ef"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.249544 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.249576 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.249588 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mn2z\" (UniqueName: \"kubernetes.io/projected/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-kube-api-access-2mn2z\") on node \"crc\" DevicePath \"\"" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.249597 4909 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4ce039e6-b0e9-43d5-bf88-f4c169bb03ef-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.663934 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-52zqm" event={"ID":"4ce039e6-b0e9-43d5-bf88-f4c169bb03ef","Type":"ContainerDied","Data":"226fb5db1e77577afd194330af0e70e3daed10135c64e61b9830a9e8a18f14bd"} Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.663983 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="226fb5db1e77577afd194330af0e70e3daed10135c64e61b9830a9e8a18f14bd" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.664090 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-52zqm" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.745407 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4"] Dec 01 11:07:42 crc kubenswrapper[4909]: E1201 11:07:42.745927 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce039e6-b0e9-43d5-bf88-f4c169bb03ef" containerName="ssh-known-hosts-edpm-deployment" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.745946 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce039e6-b0e9-43d5-bf88-f4c169bb03ef" containerName="ssh-known-hosts-edpm-deployment" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.746168 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ce039e6-b0e9-43d5-bf88-f4c169bb03ef" containerName="ssh-known-hosts-edpm-deployment" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.747060 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.750094 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.750131 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.750166 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.750641 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.751117 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.755139 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4"] Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.861234 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/948452ff-597c-4b3d-ba9a-11fb527a3c55-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5xbt4\" (UID: \"948452ff-597c-4b3d-ba9a-11fb527a3c55\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.861305 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/948452ff-597c-4b3d-ba9a-11fb527a3c55-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5xbt4\" (UID: \"948452ff-597c-4b3d-ba9a-11fb527a3c55\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.861365 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w98cg\" (UniqueName: \"kubernetes.io/projected/948452ff-597c-4b3d-ba9a-11fb527a3c55-kube-api-access-w98cg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5xbt4\" (UID: \"948452ff-597c-4b3d-ba9a-11fb527a3c55\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.861439 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/948452ff-597c-4b3d-ba9a-11fb527a3c55-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5xbt4\" (UID: \"948452ff-597c-4b3d-ba9a-11fb527a3c55\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.963332 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/948452ff-597c-4b3d-ba9a-11fb527a3c55-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5xbt4\" (UID: \"948452ff-597c-4b3d-ba9a-11fb527a3c55\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.963382 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/948452ff-597c-4b3d-ba9a-11fb527a3c55-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5xbt4\" (UID: \"948452ff-597c-4b3d-ba9a-11fb527a3c55\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.963448 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w98cg\" (UniqueName: 
\"kubernetes.io/projected/948452ff-597c-4b3d-ba9a-11fb527a3c55-kube-api-access-w98cg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5xbt4\" (UID: \"948452ff-597c-4b3d-ba9a-11fb527a3c55\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.964428 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/948452ff-597c-4b3d-ba9a-11fb527a3c55-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5xbt4\" (UID: \"948452ff-597c-4b3d-ba9a-11fb527a3c55\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.968090 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/948452ff-597c-4b3d-ba9a-11fb527a3c55-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5xbt4\" (UID: \"948452ff-597c-4b3d-ba9a-11fb527a3c55\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.968379 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/948452ff-597c-4b3d-ba9a-11fb527a3c55-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5xbt4\" (UID: \"948452ff-597c-4b3d-ba9a-11fb527a3c55\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.969935 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/948452ff-597c-4b3d-ba9a-11fb527a3c55-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5xbt4\" (UID: \"948452ff-597c-4b3d-ba9a-11fb527a3c55\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4" Dec 01 11:07:42 crc kubenswrapper[4909]: I1201 11:07:42.981031 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-w98cg\" (UniqueName: \"kubernetes.io/projected/948452ff-597c-4b3d-ba9a-11fb527a3c55-kube-api-access-w98cg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5xbt4\" (UID: \"948452ff-597c-4b3d-ba9a-11fb527a3c55\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4" Dec 01 11:07:43 crc kubenswrapper[4909]: I1201 11:07:43.065077 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4" Dec 01 11:07:43 crc kubenswrapper[4909]: I1201 11:07:43.102412 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-smsql" Dec 01 11:07:43 crc kubenswrapper[4909]: I1201 11:07:43.175917 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-smsql" Dec 01 11:07:43 crc kubenswrapper[4909]: I1201 11:07:43.347240 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smsql"] Dec 01 11:07:43 crc kubenswrapper[4909]: I1201 11:07:43.624306 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4"] Dec 01 11:07:43 crc kubenswrapper[4909]: I1201 11:07:43.672818 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4" event={"ID":"948452ff-597c-4b3d-ba9a-11fb527a3c55","Type":"ContainerStarted","Data":"ca1dfd55b3cb1f9d7961c120dd5274be2180d578bfba4d4dfab44c511e75a551"} Dec 01 11:07:44 crc kubenswrapper[4909]: I1201 11:07:44.686706 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-smsql" podUID="bb661c25-da6e-45dc-9c4a-7ceaf7a2c805" containerName="registry-server" containerID="cri-o://8a28930ca946ba33fd54c6fb87837922b7e59023a67f465f2f63a57ece0176e7" gracePeriod=2 Dec 01 11:07:45 crc kubenswrapper[4909]: 
I1201 11:07:45.173115 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smsql" Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.328073 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-464wz\" (UniqueName: \"kubernetes.io/projected/bb661c25-da6e-45dc-9c4a-7ceaf7a2c805-kube-api-access-464wz\") pod \"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805\" (UID: \"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805\") " Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.328244 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb661c25-da6e-45dc-9c4a-7ceaf7a2c805-catalog-content\") pod \"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805\" (UID: \"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805\") " Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.328814 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb661c25-da6e-45dc-9c4a-7ceaf7a2c805-utilities\") pod \"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805\" (UID: \"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805\") " Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.330440 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb661c25-da6e-45dc-9c4a-7ceaf7a2c805-utilities" (OuterVolumeSpecName: "utilities") pod "bb661c25-da6e-45dc-9c4a-7ceaf7a2c805" (UID: "bb661c25-da6e-45dc-9c4a-7ceaf7a2c805"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.339017 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb661c25-da6e-45dc-9c4a-7ceaf7a2c805-kube-api-access-464wz" (OuterVolumeSpecName: "kube-api-access-464wz") pod "bb661c25-da6e-45dc-9c4a-7ceaf7a2c805" (UID: "bb661c25-da6e-45dc-9c4a-7ceaf7a2c805"). InnerVolumeSpecName "kube-api-access-464wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.432273 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-464wz\" (UniqueName: \"kubernetes.io/projected/bb661c25-da6e-45dc-9c4a-7ceaf7a2c805-kube-api-access-464wz\") on node \"crc\" DevicePath \"\""
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.432317 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb661c25-da6e-45dc-9c4a-7ceaf7a2c805-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.468863 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb661c25-da6e-45dc-9c4a-7ceaf7a2c805-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb661c25-da6e-45dc-9c4a-7ceaf7a2c805" (UID: "bb661c25-da6e-45dc-9c4a-7ceaf7a2c805"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.533780 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb661c25-da6e-45dc-9c4a-7ceaf7a2c805-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.699892 4909 generic.go:334] "Generic (PLEG): container finished" podID="bb661c25-da6e-45dc-9c4a-7ceaf7a2c805" containerID="8a28930ca946ba33fd54c6fb87837922b7e59023a67f465f2f63a57ece0176e7" exitCode=0
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.700045 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smsql"
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.701801 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smsql" event={"ID":"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805","Type":"ContainerDied","Data":"8a28930ca946ba33fd54c6fb87837922b7e59023a67f465f2f63a57ece0176e7"}
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.701900 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smsql" event={"ID":"bb661c25-da6e-45dc-9c4a-7ceaf7a2c805","Type":"ContainerDied","Data":"36dc272222458bce00b7941b8ae16c90fcfa9998f3046369772777e2ccc05c28"}
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.701929 4909 scope.go:117] "RemoveContainer" containerID="8a28930ca946ba33fd54c6fb87837922b7e59023a67f465f2f63a57ece0176e7"
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.707647 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4" event={"ID":"948452ff-597c-4b3d-ba9a-11fb527a3c55","Type":"ContainerStarted","Data":"a253622508b6a2f537ab80b1f930b35f7a80466209370cc647664e8357e8d497"}
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.732839 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4" podStartSLOduration=2.917052677 podStartE2EDuration="3.732803955s" podCreationTimestamp="2025-12-01 11:07:42 +0000 UTC" firstStartedPulling="2025-12-01 11:07:43.631522514 +0000 UTC m=+2180.865993402" lastFinishedPulling="2025-12-01 11:07:44.447273782 +0000 UTC m=+2181.681744680" observedRunningTime="2025-12-01 11:07:45.72671802 +0000 UTC m=+2182.961188918" watchObservedRunningTime="2025-12-01 11:07:45.732803955 +0000 UTC m=+2182.967274903"
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.742919 4909 scope.go:117] "RemoveContainer" containerID="0c24abea93b594d5baad32dcc4935c25f9c1fc0f6bc2c8188327090ad82719e2"
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.765672 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smsql"]
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.778566 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-smsql"]
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.780918 4909 scope.go:117] "RemoveContainer" containerID="7400fd1a4682caf1d986fee404de7b5839559a36a13ae20d9e0b33c4faad84e2"
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.821597 4909 scope.go:117] "RemoveContainer" containerID="8a28930ca946ba33fd54c6fb87837922b7e59023a67f465f2f63a57ece0176e7"
Dec 01 11:07:45 crc kubenswrapper[4909]: E1201 11:07:45.822226 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a28930ca946ba33fd54c6fb87837922b7e59023a67f465f2f63a57ece0176e7\": container with ID starting with 8a28930ca946ba33fd54c6fb87837922b7e59023a67f465f2f63a57ece0176e7 not found: ID does not exist" containerID="8a28930ca946ba33fd54c6fb87837922b7e59023a67f465f2f63a57ece0176e7"
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.822277 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a28930ca946ba33fd54c6fb87837922b7e59023a67f465f2f63a57ece0176e7"} err="failed to get container status \"8a28930ca946ba33fd54c6fb87837922b7e59023a67f465f2f63a57ece0176e7\": rpc error: code = NotFound desc = could not find container \"8a28930ca946ba33fd54c6fb87837922b7e59023a67f465f2f63a57ece0176e7\": container with ID starting with 8a28930ca946ba33fd54c6fb87837922b7e59023a67f465f2f63a57ece0176e7 not found: ID does not exist"
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.822312 4909 scope.go:117] "RemoveContainer" containerID="0c24abea93b594d5baad32dcc4935c25f9c1fc0f6bc2c8188327090ad82719e2"
Dec 01 11:07:45 crc kubenswrapper[4909]: E1201 11:07:45.822767 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c24abea93b594d5baad32dcc4935c25f9c1fc0f6bc2c8188327090ad82719e2\": container with ID starting with 0c24abea93b594d5baad32dcc4935c25f9c1fc0f6bc2c8188327090ad82719e2 not found: ID does not exist" containerID="0c24abea93b594d5baad32dcc4935c25f9c1fc0f6bc2c8188327090ad82719e2"
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.822803 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c24abea93b594d5baad32dcc4935c25f9c1fc0f6bc2c8188327090ad82719e2"} err="failed to get container status \"0c24abea93b594d5baad32dcc4935c25f9c1fc0f6bc2c8188327090ad82719e2\": rpc error: code = NotFound desc = could not find container \"0c24abea93b594d5baad32dcc4935c25f9c1fc0f6bc2c8188327090ad82719e2\": container with ID starting with 0c24abea93b594d5baad32dcc4935c25f9c1fc0f6bc2c8188327090ad82719e2 not found: ID does not exist"
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.822821 4909 scope.go:117] "RemoveContainer" containerID="7400fd1a4682caf1d986fee404de7b5839559a36a13ae20d9e0b33c4faad84e2"
Dec 01 11:07:45 crc kubenswrapper[4909]: E1201 11:07:45.823150 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7400fd1a4682caf1d986fee404de7b5839559a36a13ae20d9e0b33c4faad84e2\": container with ID starting with 7400fd1a4682caf1d986fee404de7b5839559a36a13ae20d9e0b33c4faad84e2 not found: ID does not exist" containerID="7400fd1a4682caf1d986fee404de7b5839559a36a13ae20d9e0b33c4faad84e2"
Dec 01 11:07:45 crc kubenswrapper[4909]: I1201 11:07:45.823180 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7400fd1a4682caf1d986fee404de7b5839559a36a13ae20d9e0b33c4faad84e2"} err="failed to get container status \"7400fd1a4682caf1d986fee404de7b5839559a36a13ae20d9e0b33c4faad84e2\": rpc error: code = NotFound desc = could not find container \"7400fd1a4682caf1d986fee404de7b5839559a36a13ae20d9e0b33c4faad84e2\": container with ID starting with 7400fd1a4682caf1d986fee404de7b5839559a36a13ae20d9e0b33c4faad84e2 not found: ID does not exist"
Dec 01 11:07:47 crc kubenswrapper[4909]: I1201 11:07:47.267639 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb661c25-da6e-45dc-9c4a-7ceaf7a2c805" path="/var/lib/kubelet/pods/bb661c25-da6e-45dc-9c4a-7ceaf7a2c805/volumes"
Dec 01 11:07:51 crc kubenswrapper[4909]: I1201 11:07:51.772619 4909 generic.go:334] "Generic (PLEG): container finished" podID="948452ff-597c-4b3d-ba9a-11fb527a3c55" containerID="a253622508b6a2f537ab80b1f930b35f7a80466209370cc647664e8357e8d497" exitCode=0
Dec 01 11:07:51 crc kubenswrapper[4909]: I1201 11:07:51.772711 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4" event={"ID":"948452ff-597c-4b3d-ba9a-11fb527a3c55","Type":"ContainerDied","Data":"a253622508b6a2f537ab80b1f930b35f7a80466209370cc647664e8357e8d497"}
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.335754 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4"
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.501297 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w98cg\" (UniqueName: \"kubernetes.io/projected/948452ff-597c-4b3d-ba9a-11fb527a3c55-kube-api-access-w98cg\") pod \"948452ff-597c-4b3d-ba9a-11fb527a3c55\" (UID: \"948452ff-597c-4b3d-ba9a-11fb527a3c55\") "
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.501729 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/948452ff-597c-4b3d-ba9a-11fb527a3c55-ceph\") pod \"948452ff-597c-4b3d-ba9a-11fb527a3c55\" (UID: \"948452ff-597c-4b3d-ba9a-11fb527a3c55\") "
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.501778 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/948452ff-597c-4b3d-ba9a-11fb527a3c55-inventory\") pod \"948452ff-597c-4b3d-ba9a-11fb527a3c55\" (UID: \"948452ff-597c-4b3d-ba9a-11fb527a3c55\") "
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.501832 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/948452ff-597c-4b3d-ba9a-11fb527a3c55-ssh-key\") pod \"948452ff-597c-4b3d-ba9a-11fb527a3c55\" (UID: \"948452ff-597c-4b3d-ba9a-11fb527a3c55\") "
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.509910 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/948452ff-597c-4b3d-ba9a-11fb527a3c55-ceph" (OuterVolumeSpecName: "ceph") pod "948452ff-597c-4b3d-ba9a-11fb527a3c55" (UID: "948452ff-597c-4b3d-ba9a-11fb527a3c55"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.510080 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/948452ff-597c-4b3d-ba9a-11fb527a3c55-kube-api-access-w98cg" (OuterVolumeSpecName: "kube-api-access-w98cg") pod "948452ff-597c-4b3d-ba9a-11fb527a3c55" (UID: "948452ff-597c-4b3d-ba9a-11fb527a3c55"). InnerVolumeSpecName "kube-api-access-w98cg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.532613 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/948452ff-597c-4b3d-ba9a-11fb527a3c55-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "948452ff-597c-4b3d-ba9a-11fb527a3c55" (UID: "948452ff-597c-4b3d-ba9a-11fb527a3c55"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.532718 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/948452ff-597c-4b3d-ba9a-11fb527a3c55-inventory" (OuterVolumeSpecName: "inventory") pod "948452ff-597c-4b3d-ba9a-11fb527a3c55" (UID: "948452ff-597c-4b3d-ba9a-11fb527a3c55"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.604368 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/948452ff-597c-4b3d-ba9a-11fb527a3c55-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.604417 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w98cg\" (UniqueName: \"kubernetes.io/projected/948452ff-597c-4b3d-ba9a-11fb527a3c55-kube-api-access-w98cg\") on node \"crc\" DevicePath \"\""
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.604433 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/948452ff-597c-4b3d-ba9a-11fb527a3c55-ceph\") on node \"crc\" DevicePath \"\""
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.604443 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/948452ff-597c-4b3d-ba9a-11fb527a3c55-inventory\") on node \"crc\" DevicePath \"\""
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.789717 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4" event={"ID":"948452ff-597c-4b3d-ba9a-11fb527a3c55","Type":"ContainerDied","Data":"ca1dfd55b3cb1f9d7961c120dd5274be2180d578bfba4d4dfab44c511e75a551"}
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.789765 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca1dfd55b3cb1f9d7961c120dd5274be2180d578bfba4d4dfab44c511e75a551"
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.789773 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5xbt4"
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.953531 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh"]
Dec 01 11:07:53 crc kubenswrapper[4909]: E1201 11:07:53.954042 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb661c25-da6e-45dc-9c4a-7ceaf7a2c805" containerName="extract-content"
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.954066 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb661c25-da6e-45dc-9c4a-7ceaf7a2c805" containerName="extract-content"
Dec 01 11:07:53 crc kubenswrapper[4909]: E1201 11:07:53.954085 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="948452ff-597c-4b3d-ba9a-11fb527a3c55" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.954094 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="948452ff-597c-4b3d-ba9a-11fb527a3c55" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 11:07:53 crc kubenswrapper[4909]: E1201 11:07:53.954120 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb661c25-da6e-45dc-9c4a-7ceaf7a2c805" containerName="extract-utilities"
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.954128 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb661c25-da6e-45dc-9c4a-7ceaf7a2c805" containerName="extract-utilities"
Dec 01 11:07:53 crc kubenswrapper[4909]: E1201 11:07:53.954139 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb661c25-da6e-45dc-9c4a-7ceaf7a2c805" containerName="registry-server"
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.954148 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb661c25-da6e-45dc-9c4a-7ceaf7a2c805" containerName="registry-server"
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.954376 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb661c25-da6e-45dc-9c4a-7ceaf7a2c805" containerName="registry-server"
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.954404 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="948452ff-597c-4b3d-ba9a-11fb527a3c55" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.955868 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh"
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.958157 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.959649 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv"
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.959691 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.960741 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.970869 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh"]
Dec 01 11:07:53 crc kubenswrapper[4909]: I1201 11:07:53.980638 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 01 11:07:54 crc kubenswrapper[4909]: I1201 11:07:54.013114 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh\" (UID: \"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh"
Dec 01 11:07:54 crc kubenswrapper[4909]: I1201 11:07:54.013166 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh\" (UID: \"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh"
Dec 01 11:07:54 crc kubenswrapper[4909]: I1201 11:07:54.013284 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7bgj\" (UniqueName: \"kubernetes.io/projected/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-kube-api-access-b7bgj\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh\" (UID: \"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh"
Dec 01 11:07:54 crc kubenswrapper[4909]: I1201 11:07:54.013482 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh\" (UID: \"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh"
Dec 01 11:07:54 crc kubenswrapper[4909]: I1201 11:07:54.115024 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh\" (UID: \"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh"
Dec 01 11:07:54 crc kubenswrapper[4909]: I1201 11:07:54.115086 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh\" (UID: \"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh"
Dec 01 11:07:54 crc kubenswrapper[4909]: I1201 11:07:54.115124 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh\" (UID: \"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh"
Dec 01 11:07:54 crc kubenswrapper[4909]: I1201 11:07:54.115170 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7bgj\" (UniqueName: \"kubernetes.io/projected/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-kube-api-access-b7bgj\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh\" (UID: \"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh"
Dec 01 11:07:54 crc kubenswrapper[4909]: I1201 11:07:54.120257 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh\" (UID: \"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh"
Dec 01 11:07:54 crc kubenswrapper[4909]: I1201 11:07:54.120293 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh\" (UID: \"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh"
Dec 01 11:07:54 crc kubenswrapper[4909]: I1201 11:07:54.129031 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh\" (UID: \"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh"
Dec 01 11:07:54 crc kubenswrapper[4909]: I1201 11:07:54.132508 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7bgj\" (UniqueName: \"kubernetes.io/projected/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-kube-api-access-b7bgj\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh\" (UID: \"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh"
Dec 01 11:07:54 crc kubenswrapper[4909]: I1201 11:07:54.313091 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh"
Dec 01 11:07:54 crc kubenswrapper[4909]: I1201 11:07:54.841986 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh"]
Dec 01 11:07:54 crc kubenswrapper[4909]: W1201 11:07:54.848124 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1a2a204_dcb3_401d_8f9f_aa2d9bf5fbb5.slice/crio-4aef9b8a2aab906120e97fdef2501fb99dadb60e627412456d0caab17e095f56 WatchSource:0}: Error finding container 4aef9b8a2aab906120e97fdef2501fb99dadb60e627412456d0caab17e095f56: Status 404 returned error can't find the container with id 4aef9b8a2aab906120e97fdef2501fb99dadb60e627412456d0caab17e095f56
Dec 01 11:07:55 crc kubenswrapper[4909]: I1201 11:07:55.806238 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh" event={"ID":"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5","Type":"ContainerStarted","Data":"1a53789010872328c9c844d343eaa46e933bffedfbea32d7e1f483d72e10c196"}
Dec 01 11:07:55 crc kubenswrapper[4909]: I1201 11:07:55.806698 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh" event={"ID":"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5","Type":"ContainerStarted","Data":"4aef9b8a2aab906120e97fdef2501fb99dadb60e627412456d0caab17e095f56"}
Dec 01 11:07:55 crc kubenswrapper[4909]: I1201 11:07:55.827808 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh" podStartSLOduration=2.324685905 podStartE2EDuration="2.827790352s" podCreationTimestamp="2025-12-01 11:07:53 +0000 UTC" firstStartedPulling="2025-12-01 11:07:54.850234857 +0000 UTC m=+2192.084705755" lastFinishedPulling="2025-12-01 11:07:55.353339304 +0000 UTC m=+2192.587810202" observedRunningTime="2025-12-01 11:07:55.821629895 +0000 UTC m=+2193.056100793" watchObservedRunningTime="2025-12-01 11:07:55.827790352 +0000 UTC m=+2193.062261250"
Dec 01 11:08:05 crc kubenswrapper[4909]: I1201 11:08:05.889258 4909 generic.go:334] "Generic (PLEG): container finished" podID="b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5" containerID="1a53789010872328c9c844d343eaa46e933bffedfbea32d7e1f483d72e10c196" exitCode=0
Dec 01 11:08:05 crc kubenswrapper[4909]: I1201 11:08:05.889854 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh" event={"ID":"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5","Type":"ContainerDied","Data":"1a53789010872328c9c844d343eaa46e933bffedfbea32d7e1f483d72e10c196"}
Dec 01 11:08:06 crc kubenswrapper[4909]: I1201 11:08:06.193501 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 11:08:06 crc kubenswrapper[4909]: I1201 11:08:06.193566 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 11:08:06 crc kubenswrapper[4909]: I1201 11:08:06.193618 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2"
Dec 01 11:08:06 crc kubenswrapper[4909]: I1201 11:08:06.194458 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08"} pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 11:08:06 crc kubenswrapper[4909]: I1201 11:08:06.194514 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" containerID="cri-o://10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" gracePeriod=600
Dec 01 11:08:06 crc kubenswrapper[4909]: E1201 11:08:06.337615 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be"
Dec 01 11:08:06 crc kubenswrapper[4909]: I1201 11:08:06.900580 4909 generic.go:334] "Generic (PLEG): container finished" podID="672850e4-d044-44cc-b8a2-517dc1a285be" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" exitCode=0
Dec 01 11:08:06 crc kubenswrapper[4909]: I1201 11:08:06.900660 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerDied","Data":"10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08"}
Dec 01 11:08:06 crc kubenswrapper[4909]: I1201 11:08:06.900723 4909 scope.go:117] "RemoveContainer" containerID="f41d4464aa3fd8f418a1acd106f20e41fd1625eaf7c86916aa685bf86d68ce5c"
Dec 01 11:08:06 crc kubenswrapper[4909]: I1201 11:08:06.901355 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08"
Dec 01 11:08:06 crc kubenswrapper[4909]: E1201 11:08:06.901631 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be"
Dec 01 11:08:07 crc kubenswrapper[4909]: I1201 11:08:07.295646 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh"
Dec 01 11:08:07 crc kubenswrapper[4909]: I1201 11:08:07.371360 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-inventory\") pod \"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5\" (UID: \"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5\") "
Dec 01 11:08:07 crc kubenswrapper[4909]: I1201 11:08:07.371609 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7bgj\" (UniqueName: \"kubernetes.io/projected/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-kube-api-access-b7bgj\") pod \"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5\" (UID: \"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5\") "
Dec 01 11:08:07 crc kubenswrapper[4909]: I1201 11:08:07.371670 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-ssh-key\") pod \"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5\" (UID: \"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5\") "
Dec 01 11:08:07 crc kubenswrapper[4909]: I1201 11:08:07.371692 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-ceph\") pod \"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5\" (UID: \"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5\") "
Dec 01 11:08:07 crc kubenswrapper[4909]: I1201 11:08:07.377076 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-ceph" (OuterVolumeSpecName: "ceph") pod "b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5" (UID: "b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:08:07 crc kubenswrapper[4909]: I1201 11:08:07.377319 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-kube-api-access-b7bgj" (OuterVolumeSpecName: "kube-api-access-b7bgj") pod "b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5" (UID: "b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5"). InnerVolumeSpecName "kube-api-access-b7bgj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:08:07 crc kubenswrapper[4909]: I1201 11:08:07.397837 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-inventory" (OuterVolumeSpecName: "inventory") pod "b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5" (UID: "b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:08:07 crc kubenswrapper[4909]: I1201 11:08:07.400576 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5" (UID: "b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:08:07 crc kubenswrapper[4909]: I1201 11:08:07.474373 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7bgj\" (UniqueName: \"kubernetes.io/projected/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-kube-api-access-b7bgj\") on node \"crc\" DevicePath \"\""
Dec 01 11:08:07 crc kubenswrapper[4909]: I1201 11:08:07.474409 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 01 11:08:07 crc kubenswrapper[4909]: I1201 11:08:07.474419 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-ceph\") on node \"crc\" DevicePath \"\""
Dec 01 11:08:07 crc kubenswrapper[4909]: I1201 11:08:07.474427 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5-inventory\") on node \"crc\" DevicePath \"\""
Dec 01 11:08:07 crc kubenswrapper[4909]: I1201 11:08:07.913279 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh" event={"ID":"b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5","Type":"ContainerDied","Data":"4aef9b8a2aab906120e97fdef2501fb99dadb60e627412456d0caab17e095f56"}
Dec 01 11:08:07 crc kubenswrapper[4909]: I1201 11:08:07.913630 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4aef9b8a2aab906120e97fdef2501fb99dadb60e627412456d0caab17e095f56"
Dec 01 11:08:07 crc kubenswrapper[4909]: I1201 11:08:07.913571 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh"
Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.071621 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc"]
Dec 01 11:08:08 crc kubenswrapper[4909]: E1201 11:08:08.072063 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.072085 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.072271 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.072884 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc"
Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.077071 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.077148 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.077537 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv"
Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.077550 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.077758 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.077808 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.077775 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.078278 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.083883 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc"
Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.083976 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc"
Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.084017 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc"
Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.084108 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc"
Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.084163 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc"
Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.084196 4909
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.084218 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.084292 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.084329 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.084365 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7zdg\" (UniqueName: \"kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-kube-api-access-q7zdg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.084390 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.084520 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.084625 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.089286 4909 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc"] Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.187004 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.187420 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.187521 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.187615 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.187721 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.187792 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.188384 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.188468 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.188568 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.188661 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.188772 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.188859 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.188975 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7zdg\" (UniqueName: \"kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-kube-api-access-q7zdg\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.192537 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.192547 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.192619 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.192707 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.194013 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.194273 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.194954 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.195065 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.198599 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.198697 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.204504 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.206332 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.208319 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7zdg\" (UniqueName: \"kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-kube-api-access-q7zdg\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.395391 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:08 crc kubenswrapper[4909]: I1201 11:08:08.939711 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc"] Dec 01 11:08:09 crc kubenswrapper[4909]: I1201 11:08:09.936953 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" event={"ID":"365bd5e9-5b82-4c6d-b8cf-57a73146653e","Type":"ContainerStarted","Data":"aeb0dad684bd8f3cd7a0ca299a59142147049cea2cd157517f579ebb755d3298"} Dec 01 11:08:09 crc kubenswrapper[4909]: I1201 11:08:09.937644 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" event={"ID":"365bd5e9-5b82-4c6d-b8cf-57a73146653e","Type":"ContainerStarted","Data":"3c9e96693d7a492fd225fcc707e4a95befa824f20e3d1ee01db05766d664fa84"} Dec 01 11:08:09 crc kubenswrapper[4909]: I1201 11:08:09.962571 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" podStartSLOduration=1.452027104 podStartE2EDuration="1.962548397s" podCreationTimestamp="2025-12-01 11:08:08 +0000 UTC" firstStartedPulling="2025-12-01 11:08:08.94275106 +0000 UTC m=+2206.177221978" lastFinishedPulling="2025-12-01 11:08:09.453272373 +0000 UTC m=+2206.687743271" observedRunningTime="2025-12-01 11:08:09.954052088 +0000 UTC m=+2207.188522986" watchObservedRunningTime="2025-12-01 11:08:09.962548397 +0000 UTC m=+2207.197019295" Dec 01 11:08:20 crc kubenswrapper[4909]: I1201 11:08:20.256956 4909 
scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:08:20 crc kubenswrapper[4909]: E1201 11:08:20.257645 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:08:31 crc kubenswrapper[4909]: I1201 11:08:31.257258 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:08:31 crc kubenswrapper[4909]: E1201 11:08:31.257943 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:08:41 crc kubenswrapper[4909]: I1201 11:08:41.230568 4909 generic.go:334] "Generic (PLEG): container finished" podID="365bd5e9-5b82-4c6d-b8cf-57a73146653e" containerID="aeb0dad684bd8f3cd7a0ca299a59142147049cea2cd157517f579ebb755d3298" exitCode=0 Dec 01 11:08:41 crc kubenswrapper[4909]: I1201 11:08:41.230697 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" event={"ID":"365bd5e9-5b82-4c6d-b8cf-57a73146653e","Type":"ContainerDied","Data":"aeb0dad684bd8f3cd7a0ca299a59142147049cea2cd157517f579ebb755d3298"} Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.651103 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.731343 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.731478 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-ssh-key\") pod \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.731581 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-bootstrap-combined-ca-bundle\") pod \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.731606 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-repo-setup-combined-ca-bundle\") pod \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.731676 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-inventory\") pod \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 
11:08:42.732627 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-ceph\") pod \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.732698 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.732741 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.732783 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7zdg\" (UniqueName: \"kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-kube-api-access-q7zdg\") pod \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.732820 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-nova-combined-ca-bundle\") pod \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.732915 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-ovn-combined-ca-bundle\") pod \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.732966 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-libvirt-combined-ca-bundle\") pod \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.733010 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-neutron-metadata-combined-ca-bundle\") pod \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\" (UID: \"365bd5e9-5b82-4c6d-b8cf-57a73146653e\") " Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.738985 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "365bd5e9-5b82-4c6d-b8cf-57a73146653e" (UID: "365bd5e9-5b82-4c6d-b8cf-57a73146653e"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.741587 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-ceph" (OuterVolumeSpecName: "ceph") pod "365bd5e9-5b82-4c6d-b8cf-57a73146653e" (UID: "365bd5e9-5b82-4c6d-b8cf-57a73146653e"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.741796 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "365bd5e9-5b82-4c6d-b8cf-57a73146653e" (UID: "365bd5e9-5b82-4c6d-b8cf-57a73146653e"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.741822 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "365bd5e9-5b82-4c6d-b8cf-57a73146653e" (UID: "365bd5e9-5b82-4c6d-b8cf-57a73146653e"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.742242 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "365bd5e9-5b82-4c6d-b8cf-57a73146653e" (UID: "365bd5e9-5b82-4c6d-b8cf-57a73146653e"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.742624 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "365bd5e9-5b82-4c6d-b8cf-57a73146653e" (UID: "365bd5e9-5b82-4c6d-b8cf-57a73146653e"). 
InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.742771 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "365bd5e9-5b82-4c6d-b8cf-57a73146653e" (UID: "365bd5e9-5b82-4c6d-b8cf-57a73146653e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.742971 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-kube-api-access-q7zdg" (OuterVolumeSpecName: "kube-api-access-q7zdg") pod "365bd5e9-5b82-4c6d-b8cf-57a73146653e" (UID: "365bd5e9-5b82-4c6d-b8cf-57a73146653e"). InnerVolumeSpecName "kube-api-access-q7zdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.743322 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "365bd5e9-5b82-4c6d-b8cf-57a73146653e" (UID: "365bd5e9-5b82-4c6d-b8cf-57a73146653e"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.744102 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "365bd5e9-5b82-4c6d-b8cf-57a73146653e" (UID: "365bd5e9-5b82-4c6d-b8cf-57a73146653e"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.746937 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "365bd5e9-5b82-4c6d-b8cf-57a73146653e" (UID: "365bd5e9-5b82-4c6d-b8cf-57a73146653e"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.763534 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-inventory" (OuterVolumeSpecName: "inventory") pod "365bd5e9-5b82-4c6d-b8cf-57a73146653e" (UID: "365bd5e9-5b82-4c6d-b8cf-57a73146653e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.779784 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "365bd5e9-5b82-4c6d-b8cf-57a73146653e" (UID: "365bd5e9-5b82-4c6d-b8cf-57a73146653e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.835308 4909 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.835356 4909 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.835370 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.835387 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.835399 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.835411 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.835421 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7zdg\" (UniqueName: 
\"kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-kube-api-access-q7zdg\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.835431 4909 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.835439 4909 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.835450 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.835458 4909 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.835467 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/365bd5e9-5b82-4c6d-b8cf-57a73146653e-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:42 crc kubenswrapper[4909]: I1201 11:08:42.835477 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/365bd5e9-5b82-4c6d-b8cf-57a73146653e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.250434 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" event={"ID":"365bd5e9-5b82-4c6d-b8cf-57a73146653e","Type":"ContainerDied","Data":"3c9e96693d7a492fd225fcc707e4a95befa824f20e3d1ee01db05766d664fa84"} Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.250487 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c9e96693d7a492fd225fcc707e4a95befa824f20e3d1ee01db05766d664fa84" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.250533 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.361418 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd"] Dec 01 11:08:43 crc kubenswrapper[4909]: E1201 11:08:43.361961 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="365bd5e9-5b82-4c6d-b8cf-57a73146653e" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.361990 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="365bd5e9-5b82-4c6d-b8cf-57a73146653e" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.362229 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="365bd5e9-5b82-4c6d-b8cf-57a73146653e" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.363145 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.365747 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.367032 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.367227 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.369012 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.369377 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.374795 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd"] Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.449766 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd\" (UID: \"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.449823 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd\" (UID: \"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1\") " 
pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.449844 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd\" (UID: \"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.449905 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qmhk\" (UniqueName: \"kubernetes.io/projected/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-kube-api-access-9qmhk\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd\" (UID: \"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.552097 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd\" (UID: \"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.552174 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd\" (UID: \"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.552200 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd\" (UID: \"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.552253 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qmhk\" (UniqueName: \"kubernetes.io/projected/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-kube-api-access-9qmhk\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd\" (UID: \"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.556803 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd\" (UID: \"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.556866 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd\" (UID: \"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.557344 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd\" (UID: \"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 
11:08:43.568069 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qmhk\" (UniqueName: \"kubernetes.io/projected/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-kube-api-access-9qmhk\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd\" (UID: \"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd" Dec 01 11:08:43 crc kubenswrapper[4909]: I1201 11:08:43.681908 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd" Dec 01 11:08:44 crc kubenswrapper[4909]: I1201 11:08:44.175771 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd"] Dec 01 11:08:44 crc kubenswrapper[4909]: I1201 11:08:44.177213 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 11:08:44 crc kubenswrapper[4909]: I1201 11:08:44.256738 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:08:44 crc kubenswrapper[4909]: E1201 11:08:44.257086 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:08:44 crc kubenswrapper[4909]: I1201 11:08:44.259247 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd" event={"ID":"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1","Type":"ContainerStarted","Data":"074e5014f22cab664a52e0262eddcbf14e2df45e79636fb8224d24330dcfcdc7"} Dec 01 11:08:45 crc 
kubenswrapper[4909]: I1201 11:08:45.271184 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd" event={"ID":"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1","Type":"ContainerStarted","Data":"dfec365a3c6527a7baeaeb34d6421bfd47ca3884f5aacd17a9aea65ca20479d2"} Dec 01 11:08:45 crc kubenswrapper[4909]: I1201 11:08:45.300640 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd" podStartSLOduration=1.766314973 podStartE2EDuration="2.300611848s" podCreationTimestamp="2025-12-01 11:08:43 +0000 UTC" firstStartedPulling="2025-12-01 11:08:44.176900116 +0000 UTC m=+2241.411371014" lastFinishedPulling="2025-12-01 11:08:44.711196991 +0000 UTC m=+2241.945667889" observedRunningTime="2025-12-01 11:08:45.290158918 +0000 UTC m=+2242.524629816" watchObservedRunningTime="2025-12-01 11:08:45.300611848 +0000 UTC m=+2242.535082756" Dec 01 11:08:51 crc kubenswrapper[4909]: I1201 11:08:51.323016 4909 generic.go:334] "Generic (PLEG): container finished" podID="0008cf3a-bf47-4886-a3a2-3b68c09d8ff1" containerID="dfec365a3c6527a7baeaeb34d6421bfd47ca3884f5aacd17a9aea65ca20479d2" exitCode=0 Dec 01 11:08:51 crc kubenswrapper[4909]: I1201 11:08:51.323063 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd" event={"ID":"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1","Type":"ContainerDied","Data":"dfec365a3c6527a7baeaeb34d6421bfd47ca3884f5aacd17a9aea65ca20479d2"} Dec 01 11:08:52 crc kubenswrapper[4909]: I1201 11:08:52.790631 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd" Dec 01 11:08:52 crc kubenswrapper[4909]: I1201 11:08:52.924462 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-inventory\") pod \"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1\" (UID: \"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1\") " Dec 01 11:08:52 crc kubenswrapper[4909]: I1201 11:08:52.924711 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qmhk\" (UniqueName: \"kubernetes.io/projected/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-kube-api-access-9qmhk\") pod \"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1\" (UID: \"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1\") " Dec 01 11:08:52 crc kubenswrapper[4909]: I1201 11:08:52.924793 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-ssh-key\") pod \"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1\" (UID: \"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1\") " Dec 01 11:08:52 crc kubenswrapper[4909]: I1201 11:08:52.925080 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-ceph\") pod \"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1\" (UID: \"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1\") " Dec 01 11:08:52 crc kubenswrapper[4909]: I1201 11:08:52.933997 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-kube-api-access-9qmhk" (OuterVolumeSpecName: "kube-api-access-9qmhk") pod "0008cf3a-bf47-4886-a3a2-3b68c09d8ff1" (UID: "0008cf3a-bf47-4886-a3a2-3b68c09d8ff1"). InnerVolumeSpecName "kube-api-access-9qmhk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:08:52 crc kubenswrapper[4909]: I1201 11:08:52.937023 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-ceph" (OuterVolumeSpecName: "ceph") pod "0008cf3a-bf47-4886-a3a2-3b68c09d8ff1" (UID: "0008cf3a-bf47-4886-a3a2-3b68c09d8ff1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:08:52 crc kubenswrapper[4909]: I1201 11:08:52.962997 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-inventory" (OuterVolumeSpecName: "inventory") pod "0008cf3a-bf47-4886-a3a2-3b68c09d8ff1" (UID: "0008cf3a-bf47-4886-a3a2-3b68c09d8ff1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:08:52 crc kubenswrapper[4909]: I1201 11:08:52.971399 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0008cf3a-bf47-4886-a3a2-3b68c09d8ff1" (UID: "0008cf3a-bf47-4886-a3a2-3b68c09d8ff1"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.027125 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.027173 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.027187 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qmhk\" (UniqueName: \"kubernetes.io/projected/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-kube-api-access-9qmhk\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.027201 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0008cf3a-bf47-4886-a3a2-3b68c09d8ff1-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.345238 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd" event={"ID":"0008cf3a-bf47-4886-a3a2-3b68c09d8ff1","Type":"ContainerDied","Data":"074e5014f22cab664a52e0262eddcbf14e2df45e79636fb8224d24330dcfcdc7"} Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.345568 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="074e5014f22cab664a52e0262eddcbf14e2df45e79636fb8224d24330dcfcdc7" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.345279 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.418470 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g"] Dec 01 11:08:53 crc kubenswrapper[4909]: E1201 11:08:53.418955 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0008cf3a-bf47-4886-a3a2-3b68c09d8ff1" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.418969 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0008cf3a-bf47-4886-a3a2-3b68c09d8ff1" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.419159 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0008cf3a-bf47-4886-a3a2-3b68c09d8ff1" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.419936 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.421801 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.423200 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.423205 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.423241 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.423423 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.423550 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.426866 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g"] Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.535824 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qfv6g\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.535907 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qfv6g\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.535936 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qfv6g\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.536254 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vz9j\" (UniqueName: \"kubernetes.io/projected/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-kube-api-access-8vz9j\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qfv6g\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.536330 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qfv6g\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.536379 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qfv6g\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.638509 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qfv6g\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.638581 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qfv6g\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.638726 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vz9j\" (UniqueName: \"kubernetes.io/projected/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-kube-api-access-8vz9j\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qfv6g\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.639189 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qfv6g\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.639243 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qfv6g\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.639362 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qfv6g\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.639711 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qfv6g\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.643614 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qfv6g\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.643616 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qfv6g\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.644452 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qfv6g\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.649592 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qfv6g\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.655562 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vz9j\" (UniqueName: \"kubernetes.io/projected/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-kube-api-access-8vz9j\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qfv6g\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:08:53 crc kubenswrapper[4909]: I1201 11:08:53.738503 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:08:54 crc kubenswrapper[4909]: I1201 11:08:54.263831 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g"] Dec 01 11:08:54 crc kubenswrapper[4909]: W1201 11:08:54.265987 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fd4dda9_f078_4d82_bbf0_040ef5d994cb.slice/crio-ef118592099bb04a8336d2c693cfc45370629fd6bc160b07afb7ace153df5dae WatchSource:0}: Error finding container ef118592099bb04a8336d2c693cfc45370629fd6bc160b07afb7ace153df5dae: Status 404 returned error can't find the container with id ef118592099bb04a8336d2c693cfc45370629fd6bc160b07afb7ace153df5dae Dec 01 11:08:54 crc kubenswrapper[4909]: I1201 11:08:54.359502 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" event={"ID":"8fd4dda9-f078-4d82-bbf0-040ef5d994cb","Type":"ContainerStarted","Data":"ef118592099bb04a8336d2c693cfc45370629fd6bc160b07afb7ace153df5dae"} Dec 01 11:08:55 crc kubenswrapper[4909]: I1201 11:08:55.257668 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:08:55 crc kubenswrapper[4909]: E1201 11:08:55.258112 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:08:55 crc kubenswrapper[4909]: I1201 11:08:55.368892 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" 
event={"ID":"8fd4dda9-f078-4d82-bbf0-040ef5d994cb","Type":"ContainerStarted","Data":"4193660ca94f6992fd38ea2ed73473e6498dc7e9c613a8237461c9f8d3dd5ce6"} Dec 01 11:08:55 crc kubenswrapper[4909]: I1201 11:08:55.385063 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" podStartSLOduration=1.9145002039999999 podStartE2EDuration="2.385032563s" podCreationTimestamp="2025-12-01 11:08:53 +0000 UTC" firstStartedPulling="2025-12-01 11:08:54.268667453 +0000 UTC m=+2251.503138351" lastFinishedPulling="2025-12-01 11:08:54.739199812 +0000 UTC m=+2251.973670710" observedRunningTime="2025-12-01 11:08:55.384736764 +0000 UTC m=+2252.619207672" watchObservedRunningTime="2025-12-01 11:08:55.385032563 +0000 UTC m=+2252.619503461" Dec 01 11:09:06 crc kubenswrapper[4909]: I1201 11:09:06.258065 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:09:06 crc kubenswrapper[4909]: E1201 11:09:06.258847 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:09:20 crc kubenswrapper[4909]: I1201 11:09:20.256951 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:09:20 crc kubenswrapper[4909]: E1201 11:09:20.257683 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:09:31 crc kubenswrapper[4909]: I1201 11:09:31.261642 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:09:31 crc kubenswrapper[4909]: E1201 11:09:31.262640 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:09:43 crc kubenswrapper[4909]: I1201 11:09:43.263199 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:09:43 crc kubenswrapper[4909]: E1201 11:09:43.263989 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:09:54 crc kubenswrapper[4909]: I1201 11:09:54.259723 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:09:54 crc kubenswrapper[4909]: E1201 11:09:54.260858 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:10:01 crc kubenswrapper[4909]: I1201 11:10:01.957022 4909 generic.go:334] "Generic (PLEG): container finished" podID="8fd4dda9-f078-4d82-bbf0-040ef5d994cb" containerID="4193660ca94f6992fd38ea2ed73473e6498dc7e9c613a8237461c9f8d3dd5ce6" exitCode=0 Dec 01 11:10:01 crc kubenswrapper[4909]: I1201 11:10:01.957103 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" event={"ID":"8fd4dda9-f078-4d82-bbf0-040ef5d994cb","Type":"ContainerDied","Data":"4193660ca94f6992fd38ea2ed73473e6498dc7e9c613a8237461c9f8d3dd5ce6"} Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.363172 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.507241 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ceph\") pod \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.507359 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-inventory\") pod \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.507387 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ssh-key\") pod 
\"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.507410 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ovncontroller-config-0\") pod \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.507521 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vz9j\" (UniqueName: \"kubernetes.io/projected/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-kube-api-access-8vz9j\") pod \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.507561 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ovn-combined-ca-bundle\") pod \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\" (UID: \"8fd4dda9-f078-4d82-bbf0-040ef5d994cb\") " Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.513066 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8fd4dda9-f078-4d82-bbf0-040ef5d994cb" (UID: "8fd4dda9-f078-4d82-bbf0-040ef5d994cb"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.513548 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-kube-api-access-8vz9j" (OuterVolumeSpecName: "kube-api-access-8vz9j") pod "8fd4dda9-f078-4d82-bbf0-040ef5d994cb" (UID: "8fd4dda9-f078-4d82-bbf0-040ef5d994cb"). InnerVolumeSpecName "kube-api-access-8vz9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.513724 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ceph" (OuterVolumeSpecName: "ceph") pod "8fd4dda9-f078-4d82-bbf0-040ef5d994cb" (UID: "8fd4dda9-f078-4d82-bbf0-040ef5d994cb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.533509 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "8fd4dda9-f078-4d82-bbf0-040ef5d994cb" (UID: "8fd4dda9-f078-4d82-bbf0-040ef5d994cb"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.535811 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8fd4dda9-f078-4d82-bbf0-040ef5d994cb" (UID: "8fd4dda9-f078-4d82-bbf0-040ef5d994cb"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.537821 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-inventory" (OuterVolumeSpecName: "inventory") pod "8fd4dda9-f078-4d82-bbf0-040ef5d994cb" (UID: "8fd4dda9-f078-4d82-bbf0-040ef5d994cb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.609819 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.609864 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.609892 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.609906 4909 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.609916 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vz9j\" (UniqueName: \"kubernetes.io/projected/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-kube-api-access-8vz9j\") on node \"crc\" DevicePath \"\"" Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.609926 4909 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8fd4dda9-f078-4d82-bbf0-040ef5d994cb-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.976906 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" event={"ID":"8fd4dda9-f078-4d82-bbf0-040ef5d994cb","Type":"ContainerDied","Data":"ef118592099bb04a8336d2c693cfc45370629fd6bc160b07afb7ace153df5dae"} Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.976975 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef118592099bb04a8336d2c693cfc45370629fd6bc160b07afb7ace153df5dae" Dec 01 11:10:03 crc kubenswrapper[4909]: I1201 11:10:03.977031 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qfv6g" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.053317 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5"] Dec 01 11:10:04 crc kubenswrapper[4909]: E1201 11:10:04.053695 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd4dda9-f078-4d82-bbf0-040ef5d994cb" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.053713 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd4dda9-f078-4d82-bbf0-040ef5d994cb" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.053888 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fd4dda9-f078-4d82-bbf0-040ef5d994cb" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.054476 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.056586 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.056586 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.057659 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.057712 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.057815 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.057899 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.057999 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.065432 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5"] Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.220001 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 
11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.220037 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhdss\" (UniqueName: \"kubernetes.io/projected/4c73174a-75a2-47d9-82f9-5b26ea497032-kube-api-access-jhdss\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.220087 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.220140 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.220181 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.220207 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.220280 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.321934 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.321985 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.322031 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.322055 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.322111 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.322192 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.322216 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhdss\" (UniqueName: \"kubernetes.io/projected/4c73174a-75a2-47d9-82f9-5b26ea497032-kube-api-access-jhdss\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.326783 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.327020 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.327750 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.327849 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.328507 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.329135 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.345857 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhdss\" (UniqueName: \"kubernetes.io/projected/4c73174a-75a2-47d9-82f9-5b26ea497032-kube-api-access-jhdss\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.371352 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.867269 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5"] Dec 01 11:10:04 crc kubenswrapper[4909]: I1201 11:10:04.985740 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" event={"ID":"4c73174a-75a2-47d9-82f9-5b26ea497032","Type":"ContainerStarted","Data":"0390d9ab43fe1a3fe7fb2295e91b8790d31a79f7334a713b405445460bf8e0c5"} Dec 01 11:10:05 crc kubenswrapper[4909]: I1201 11:10:05.256762 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:10:05 crc kubenswrapper[4909]: E1201 11:10:05.257137 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:10:05 crc kubenswrapper[4909]: I1201 11:10:05.997392 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" event={"ID":"4c73174a-75a2-47d9-82f9-5b26ea497032","Type":"ContainerStarted","Data":"05bb06721c97c78b3063809dc49c8dd1b2458c136a705b32bac499155b82e430"} Dec 01 11:10:06 crc kubenswrapper[4909]: I1201 11:10:06.020446 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" podStartSLOduration=1.469666663 podStartE2EDuration="2.020424999s" podCreationTimestamp="2025-12-01 11:10:04 +0000 UTC" 
firstStartedPulling="2025-12-01 11:10:04.872930215 +0000 UTC m=+2322.107401113" lastFinishedPulling="2025-12-01 11:10:05.423688551 +0000 UTC m=+2322.658159449" observedRunningTime="2025-12-01 11:10:06.017077784 +0000 UTC m=+2323.251548702" watchObservedRunningTime="2025-12-01 11:10:06.020424999 +0000 UTC m=+2323.254895907" Dec 01 11:10:16 crc kubenswrapper[4909]: I1201 11:10:16.256709 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:10:16 crc kubenswrapper[4909]: E1201 11:10:16.257496 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:10:30 crc kubenswrapper[4909]: I1201 11:10:30.257092 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:10:30 crc kubenswrapper[4909]: E1201 11:10:30.259117 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:10:43 crc kubenswrapper[4909]: I1201 11:10:43.264084 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:10:43 crc kubenswrapper[4909]: E1201 11:10:43.264970 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:10:54 crc kubenswrapper[4909]: I1201 11:10:54.257867 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:10:54 crc kubenswrapper[4909]: E1201 11:10:54.258754 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:10:58 crc kubenswrapper[4909]: I1201 11:10:58.408045 4909 generic.go:334] "Generic (PLEG): container finished" podID="4c73174a-75a2-47d9-82f9-5b26ea497032" containerID="05bb06721c97c78b3063809dc49c8dd1b2458c136a705b32bac499155b82e430" exitCode=0 Dec 01 11:10:58 crc kubenswrapper[4909]: I1201 11:10:58.408095 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" event={"ID":"4c73174a-75a2-47d9-82f9-5b26ea497032","Type":"ContainerDied","Data":"05bb06721c97c78b3063809dc49c8dd1b2458c136a705b32bac499155b82e430"} Dec 01 11:10:59 crc kubenswrapper[4909]: I1201 11:10:59.811417 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:10:59 crc kubenswrapper[4909]: I1201 11:10:59.963555 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-inventory\") pod \"4c73174a-75a2-47d9-82f9-5b26ea497032\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " Dec 01 11:10:59 crc kubenswrapper[4909]: I1201 11:10:59.963667 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-neutron-metadata-combined-ca-bundle\") pod \"4c73174a-75a2-47d9-82f9-5b26ea497032\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " Dec 01 11:10:59 crc kubenswrapper[4909]: I1201 11:10:59.963715 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhdss\" (UniqueName: \"kubernetes.io/projected/4c73174a-75a2-47d9-82f9-5b26ea497032-kube-api-access-jhdss\") pod \"4c73174a-75a2-47d9-82f9-5b26ea497032\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " Dec 01 11:10:59 crc kubenswrapper[4909]: I1201 11:10:59.963837 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-ssh-key\") pod \"4c73174a-75a2-47d9-82f9-5b26ea497032\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " Dec 01 11:10:59 crc kubenswrapper[4909]: I1201 11:10:59.963858 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-neutron-ovn-metadata-agent-neutron-config-0\") pod \"4c73174a-75a2-47d9-82f9-5b26ea497032\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " Dec 01 11:10:59 crc kubenswrapper[4909]: I1201 
11:10:59.963894 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-nova-metadata-neutron-config-0\") pod \"4c73174a-75a2-47d9-82f9-5b26ea497032\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " Dec 01 11:10:59 crc kubenswrapper[4909]: I1201 11:10:59.963951 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-ceph\") pod \"4c73174a-75a2-47d9-82f9-5b26ea497032\" (UID: \"4c73174a-75a2-47d9-82f9-5b26ea497032\") " Dec 01 11:10:59 crc kubenswrapper[4909]: I1201 11:10:59.991257 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "4c73174a-75a2-47d9-82f9-5b26ea497032" (UID: "4c73174a-75a2-47d9-82f9-5b26ea497032"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.006058 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-ceph" (OuterVolumeSpecName: "ceph") pod "4c73174a-75a2-47d9-82f9-5b26ea497032" (UID: "4c73174a-75a2-47d9-82f9-5b26ea497032"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.018016 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c73174a-75a2-47d9-82f9-5b26ea497032-kube-api-access-jhdss" (OuterVolumeSpecName: "kube-api-access-jhdss") pod "4c73174a-75a2-47d9-82f9-5b26ea497032" (UID: "4c73174a-75a2-47d9-82f9-5b26ea497032"). InnerVolumeSpecName "kube-api-access-jhdss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.030849 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "4c73174a-75a2-47d9-82f9-5b26ea497032" (UID: "4c73174a-75a2-47d9-82f9-5b26ea497032"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.036066 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "4c73174a-75a2-47d9-82f9-5b26ea497032" (UID: "4c73174a-75a2-47d9-82f9-5b26ea497032"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.039074 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4c73174a-75a2-47d9-82f9-5b26ea497032" (UID: "4c73174a-75a2-47d9-82f9-5b26ea497032"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.040975 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-inventory" (OuterVolumeSpecName: "inventory") pod "4c73174a-75a2-47d9-82f9-5b26ea497032" (UID: "4c73174a-75a2-47d9-82f9-5b26ea497032"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.066627 4909 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.066676 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhdss\" (UniqueName: \"kubernetes.io/projected/4c73174a-75a2-47d9-82f9-5b26ea497032-kube-api-access-jhdss\") on node \"crc\" DevicePath \"\"" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.066689 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.066705 4909 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.066720 4909 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.066732 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.066744 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c73174a-75a2-47d9-82f9-5b26ea497032-inventory\") on node \"crc\" 
DevicePath \"\"" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.428493 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" event={"ID":"4c73174a-75a2-47d9-82f9-5b26ea497032","Type":"ContainerDied","Data":"0390d9ab43fe1a3fe7fb2295e91b8790d31a79f7334a713b405445460bf8e0c5"} Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.428902 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0390d9ab43fe1a3fe7fb2295e91b8790d31a79f7334a713b405445460bf8e0c5" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.428583 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5" Dec 01 11:11:00 crc kubenswrapper[4909]: E1201 11:11:00.518156 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c73174a_75a2_47d9_82f9_5b26ea497032.slice/crio-0390d9ab43fe1a3fe7fb2295e91b8790d31a79f7334a713b405445460bf8e0c5\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c73174a_75a2_47d9_82f9_5b26ea497032.slice\": RecentStats: unable to find data in memory cache]" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.568763 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x"] Dec 01 11:11:00 crc kubenswrapper[4909]: E1201 11:11:00.569260 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c73174a-75a2-47d9-82f9-5b26ea497032" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.569280 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c73174a-75a2-47d9-82f9-5b26ea497032" 
containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.569466 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c73174a-75a2-47d9-82f9-5b26ea497032" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.571112 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.574455 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.574522 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.575012 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.575401 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.575717 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.580140 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.587406 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x"] Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.677763 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: 
\"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.678080 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.678185 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.678294 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.678394 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.678505 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf625\" (UniqueName: \"kubernetes.io/projected/447d20e1-014c-49bb-a3c9-9057b255a1ed-kube-api-access-tf625\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.780667 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf625\" (UniqueName: \"kubernetes.io/projected/447d20e1-014c-49bb-a3c9-9057b255a1ed-kube-api-access-tf625\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.781069 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.781221 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.782997 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.783138 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.783235 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.785684 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.786159 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.786346 
4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.786546 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.793139 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.798322 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf625\" (UniqueName: \"kubernetes.io/projected/447d20e1-014c-49bb-a3c9-9057b255a1ed-kube-api-access-tf625\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" Dec 01 11:11:00 crc kubenswrapper[4909]: I1201 11:11:00.890601 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" Dec 01 11:11:01 crc kubenswrapper[4909]: I1201 11:11:01.381351 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x"] Dec 01 11:11:01 crc kubenswrapper[4909]: I1201 11:11:01.437657 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" event={"ID":"447d20e1-014c-49bb-a3c9-9057b255a1ed","Type":"ContainerStarted","Data":"439448f8d5e29afaf203ddfa5608a1594a60f848ab9ab0a7ce90d2a2869a4271"} Dec 01 11:11:02 crc kubenswrapper[4909]: I1201 11:11:02.447444 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" event={"ID":"447d20e1-014c-49bb-a3c9-9057b255a1ed","Type":"ContainerStarted","Data":"3b57d8c678e5c1d82a017e916294731f1f42a9a39d0290b785e04905acadf507"} Dec 01 11:11:02 crc kubenswrapper[4909]: I1201 11:11:02.465081 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" podStartSLOduration=1.8455301149999999 podStartE2EDuration="2.465065365s" podCreationTimestamp="2025-12-01 11:11:00 +0000 UTC" firstStartedPulling="2025-12-01 11:11:01.387896964 +0000 UTC m=+2378.622367862" lastFinishedPulling="2025-12-01 11:11:02.007432214 +0000 UTC m=+2379.241903112" observedRunningTime="2025-12-01 11:11:02.462459912 +0000 UTC m=+2379.696930830" watchObservedRunningTime="2025-12-01 11:11:02.465065365 +0000 UTC m=+2379.699536253" Dec 01 11:11:07 crc kubenswrapper[4909]: I1201 11:11:07.262208 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:11:07 crc kubenswrapper[4909]: E1201 11:11:07.262895 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:11:19 crc kubenswrapper[4909]: I1201 11:11:19.259723 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:11:19 crc kubenswrapper[4909]: E1201 11:11:19.260418 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:11:33 crc kubenswrapper[4909]: I1201 11:11:33.263299 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:11:33 crc kubenswrapper[4909]: E1201 11:11:33.264090 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:11:48 crc kubenswrapper[4909]: I1201 11:11:48.258082 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:11:48 crc kubenswrapper[4909]: E1201 11:11:48.258976 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:12:03 crc kubenswrapper[4909]: I1201 11:12:03.264217 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:12:03 crc kubenswrapper[4909]: E1201 11:12:03.308160 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:12:15 crc kubenswrapper[4909]: I1201 11:12:15.257026 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:12:15 crc kubenswrapper[4909]: E1201 11:12:15.259130 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:12:30 crc kubenswrapper[4909]: I1201 11:12:30.257344 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:12:30 crc kubenswrapper[4909]: E1201 11:12:30.258178 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:12:41 crc kubenswrapper[4909]: I1201 11:12:41.258092 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:12:41 crc kubenswrapper[4909]: E1201 11:12:41.259043 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:12:56 crc kubenswrapper[4909]: I1201 11:12:56.257101 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:12:56 crc kubenswrapper[4909]: E1201 11:12:56.257976 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:13:07 crc kubenswrapper[4909]: I1201 11:13:07.257988 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08" Dec 01 11:13:07 crc kubenswrapper[4909]: I1201 11:13:07.551825 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"67c1dd645f41f242e024e22de275733572cd92a40ace8824f054e102207e7cb1"} Dec 01 11:15:00 crc kubenswrapper[4909]: I1201 11:15:00.163380 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409795-4hggr"] Dec 01 11:15:00 crc kubenswrapper[4909]: I1201 11:15:00.165555 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-4hggr" Dec 01 11:15:00 crc kubenswrapper[4909]: I1201 11:15:00.168780 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 11:15:00 crc kubenswrapper[4909]: I1201 11:15:00.169046 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 11:15:00 crc kubenswrapper[4909]: I1201 11:15:00.213259 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409795-4hggr"] Dec 01 11:15:00 crc kubenswrapper[4909]: I1201 11:15:00.257665 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fea41a9e-ecee-46f7-90e5-dbba37333c85-secret-volume\") pod \"collect-profiles-29409795-4hggr\" (UID: \"fea41a9e-ecee-46f7-90e5-dbba37333c85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-4hggr" Dec 01 11:15:00 crc kubenswrapper[4909]: I1201 11:15:00.257710 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42r4v\" (UniqueName: \"kubernetes.io/projected/fea41a9e-ecee-46f7-90e5-dbba37333c85-kube-api-access-42r4v\") pod \"collect-profiles-29409795-4hggr\" (UID: 
\"fea41a9e-ecee-46f7-90e5-dbba37333c85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-4hggr" Dec 01 11:15:00 crc kubenswrapper[4909]: I1201 11:15:00.258059 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fea41a9e-ecee-46f7-90e5-dbba37333c85-config-volume\") pod \"collect-profiles-29409795-4hggr\" (UID: \"fea41a9e-ecee-46f7-90e5-dbba37333c85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-4hggr" Dec 01 11:15:00 crc kubenswrapper[4909]: I1201 11:15:00.359458 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fea41a9e-ecee-46f7-90e5-dbba37333c85-secret-volume\") pod \"collect-profiles-29409795-4hggr\" (UID: \"fea41a9e-ecee-46f7-90e5-dbba37333c85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-4hggr" Dec 01 11:15:00 crc kubenswrapper[4909]: I1201 11:15:00.359524 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42r4v\" (UniqueName: \"kubernetes.io/projected/fea41a9e-ecee-46f7-90e5-dbba37333c85-kube-api-access-42r4v\") pod \"collect-profiles-29409795-4hggr\" (UID: \"fea41a9e-ecee-46f7-90e5-dbba37333c85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-4hggr" Dec 01 11:15:00 crc kubenswrapper[4909]: I1201 11:15:00.359637 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fea41a9e-ecee-46f7-90e5-dbba37333c85-config-volume\") pod \"collect-profiles-29409795-4hggr\" (UID: \"fea41a9e-ecee-46f7-90e5-dbba37333c85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-4hggr" Dec 01 11:15:00 crc kubenswrapper[4909]: I1201 11:15:00.360612 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/fea41a9e-ecee-46f7-90e5-dbba37333c85-config-volume\") pod \"collect-profiles-29409795-4hggr\" (UID: \"fea41a9e-ecee-46f7-90e5-dbba37333c85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-4hggr" Dec 01 11:15:00 crc kubenswrapper[4909]: I1201 11:15:00.370495 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fea41a9e-ecee-46f7-90e5-dbba37333c85-secret-volume\") pod \"collect-profiles-29409795-4hggr\" (UID: \"fea41a9e-ecee-46f7-90e5-dbba37333c85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-4hggr" Dec 01 11:15:00 crc kubenswrapper[4909]: I1201 11:15:00.384781 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42r4v\" (UniqueName: \"kubernetes.io/projected/fea41a9e-ecee-46f7-90e5-dbba37333c85-kube-api-access-42r4v\") pod \"collect-profiles-29409795-4hggr\" (UID: \"fea41a9e-ecee-46f7-90e5-dbba37333c85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-4hggr" Dec 01 11:15:00 crc kubenswrapper[4909]: I1201 11:15:00.489408 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-4hggr"
Dec 01 11:15:00 crc kubenswrapper[4909]: I1201 11:15:00.931354 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409795-4hggr"]
Dec 01 11:15:01 crc kubenswrapper[4909]: I1201 11:15:01.588854 4909 generic.go:334] "Generic (PLEG): container finished" podID="fea41a9e-ecee-46f7-90e5-dbba37333c85" containerID="acc862312cc08563db7eb2aa7c20676aebf070cbea2600453d6de44a61ab369d" exitCode=0
Dec 01 11:15:01 crc kubenswrapper[4909]: I1201 11:15:01.589322 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-4hggr" event={"ID":"fea41a9e-ecee-46f7-90e5-dbba37333c85","Type":"ContainerDied","Data":"acc862312cc08563db7eb2aa7c20676aebf070cbea2600453d6de44a61ab369d"}
Dec 01 11:15:01 crc kubenswrapper[4909]: I1201 11:15:01.589367 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-4hggr" event={"ID":"fea41a9e-ecee-46f7-90e5-dbba37333c85","Type":"ContainerStarted","Data":"5096b57135210e8772be5ce56ede85a333eb1429d5f7a0bb96b66b233c958578"}
Dec 01 11:15:02 crc kubenswrapper[4909]: I1201 11:15:02.907532 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-4hggr"
Dec 01 11:15:03 crc kubenswrapper[4909]: I1201 11:15:03.016245 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fea41a9e-ecee-46f7-90e5-dbba37333c85-config-volume\") pod \"fea41a9e-ecee-46f7-90e5-dbba37333c85\" (UID: \"fea41a9e-ecee-46f7-90e5-dbba37333c85\") "
Dec 01 11:15:03 crc kubenswrapper[4909]: I1201 11:15:03.016358 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fea41a9e-ecee-46f7-90e5-dbba37333c85-secret-volume\") pod \"fea41a9e-ecee-46f7-90e5-dbba37333c85\" (UID: \"fea41a9e-ecee-46f7-90e5-dbba37333c85\") "
Dec 01 11:15:03 crc kubenswrapper[4909]: I1201 11:15:03.016445 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42r4v\" (UniqueName: \"kubernetes.io/projected/fea41a9e-ecee-46f7-90e5-dbba37333c85-kube-api-access-42r4v\") pod \"fea41a9e-ecee-46f7-90e5-dbba37333c85\" (UID: \"fea41a9e-ecee-46f7-90e5-dbba37333c85\") "
Dec 01 11:15:03 crc kubenswrapper[4909]: I1201 11:15:03.017221 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fea41a9e-ecee-46f7-90e5-dbba37333c85-config-volume" (OuterVolumeSpecName: "config-volume") pod "fea41a9e-ecee-46f7-90e5-dbba37333c85" (UID: "fea41a9e-ecee-46f7-90e5-dbba37333c85"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 11:15:03 crc kubenswrapper[4909]: I1201 11:15:03.023262 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fea41a9e-ecee-46f7-90e5-dbba37333c85-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fea41a9e-ecee-46f7-90e5-dbba37333c85" (UID: "fea41a9e-ecee-46f7-90e5-dbba37333c85"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:15:03 crc kubenswrapper[4909]: I1201 11:15:03.023820 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fea41a9e-ecee-46f7-90e5-dbba37333c85-kube-api-access-42r4v" (OuterVolumeSpecName: "kube-api-access-42r4v") pod "fea41a9e-ecee-46f7-90e5-dbba37333c85" (UID: "fea41a9e-ecee-46f7-90e5-dbba37333c85"). InnerVolumeSpecName "kube-api-access-42r4v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:15:03 crc kubenswrapper[4909]: I1201 11:15:03.118464 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fea41a9e-ecee-46f7-90e5-dbba37333c85-config-volume\") on node \"crc\" DevicePath \"\""
Dec 01 11:15:03 crc kubenswrapper[4909]: I1201 11:15:03.118807 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fea41a9e-ecee-46f7-90e5-dbba37333c85-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 01 11:15:03 crc kubenswrapper[4909]: I1201 11:15:03.118928 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42r4v\" (UniqueName: \"kubernetes.io/projected/fea41a9e-ecee-46f7-90e5-dbba37333c85-kube-api-access-42r4v\") on node \"crc\" DevicePath \"\""
Dec 01 11:15:03 crc kubenswrapper[4909]: I1201 11:15:03.608419 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-4hggr" event={"ID":"fea41a9e-ecee-46f7-90e5-dbba37333c85","Type":"ContainerDied","Data":"5096b57135210e8772be5ce56ede85a333eb1429d5f7a0bb96b66b233c958578"}
Dec 01 11:15:03 crc kubenswrapper[4909]: I1201 11:15:03.608993 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5096b57135210e8772be5ce56ede85a333eb1429d5f7a0bb96b66b233c958578"
Dec 01 11:15:03 crc kubenswrapper[4909]: I1201 11:15:03.608480 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-4hggr"
Dec 01 11:15:03 crc kubenswrapper[4909]: I1201 11:15:03.980943 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h"]
Dec 01 11:15:03 crc kubenswrapper[4909]: I1201 11:15:03.989040 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409750-j469h"]
Dec 01 11:15:05 crc kubenswrapper[4909]: I1201 11:15:05.269690 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6190ef7-3deb-4bd9-9c73-109572e871d1" path="/var/lib/kubelet/pods/d6190ef7-3deb-4bd9-9c73-109572e871d1/volumes"
Dec 01 11:15:36 crc kubenswrapper[4909]: I1201 11:15:36.193863 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 11:15:36 crc kubenswrapper[4909]: I1201 11:15:36.194535 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 11:15:42 crc kubenswrapper[4909]: I1201 11:15:42.824469 4909 scope.go:117] "RemoveContainer" containerID="a71c7aa8238a55a6461527a2dd182f2c7475657eb0842df727c47fdf1e8313a8"
Dec 01 11:16:06 crc kubenswrapper[4909]: I1201 11:16:06.193838 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 11:16:06 crc kubenswrapper[4909]: I1201 11:16:06.194419 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 11:16:36 crc kubenswrapper[4909]: I1201 11:16:36.193517 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 11:16:36 crc kubenswrapper[4909]: I1201 11:16:36.194056 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 11:16:36 crc kubenswrapper[4909]: I1201 11:16:36.194106 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2"
Dec 01 11:16:36 crc kubenswrapper[4909]: I1201 11:16:36.194859 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"67c1dd645f41f242e024e22de275733572cd92a40ace8824f054e102207e7cb1"} pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 11:16:36 crc kubenswrapper[4909]: I1201 11:16:36.194938 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" containerID="cri-o://67c1dd645f41f242e024e22de275733572cd92a40ace8824f054e102207e7cb1" gracePeriod=600
Dec 01 11:16:36 crc kubenswrapper[4909]: I1201 11:16:36.432450 4909 generic.go:334] "Generic (PLEG): container finished" podID="672850e4-d044-44cc-b8a2-517dc1a285be" containerID="67c1dd645f41f242e024e22de275733572cd92a40ace8824f054e102207e7cb1" exitCode=0
Dec 01 11:16:36 crc kubenswrapper[4909]: I1201 11:16:36.432816 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerDied","Data":"67c1dd645f41f242e024e22de275733572cd92a40ace8824f054e102207e7cb1"}
Dec 01 11:16:36 crc kubenswrapper[4909]: I1201 11:16:36.432850 4909 scope.go:117] "RemoveContainer" containerID="10ff486b86ac063250cf6b90652624f62b2422dda276ca574310a0df0e7d7f08"
Dec 01 11:16:37 crc kubenswrapper[4909]: I1201 11:16:37.443361 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19"}
Dec 01 11:16:41 crc kubenswrapper[4909]: I1201 11:16:41.683355 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dxrsn"]
Dec 01 11:16:41 crc kubenswrapper[4909]: E1201 11:16:41.684460 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea41a9e-ecee-46f7-90e5-dbba37333c85" containerName="collect-profiles"
Dec 01 11:16:41 crc kubenswrapper[4909]: I1201 11:16:41.684473 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea41a9e-ecee-46f7-90e5-dbba37333c85" containerName="collect-profiles"
Dec 01 11:16:41 crc kubenswrapper[4909]: I1201 11:16:41.684648 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea41a9e-ecee-46f7-90e5-dbba37333c85" containerName="collect-profiles"
Dec 01 11:16:41 crc kubenswrapper[4909]: I1201 11:16:41.685981 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dxrsn"
Dec 01 11:16:41 crc kubenswrapper[4909]: I1201 11:16:41.700270 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dxrsn"]
Dec 01 11:16:41 crc kubenswrapper[4909]: I1201 11:16:41.818831 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21f440d8-e9a5-48c4-b0e2-5eef96a10528-utilities\") pod \"redhat-marketplace-dxrsn\" (UID: \"21f440d8-e9a5-48c4-b0e2-5eef96a10528\") " pod="openshift-marketplace/redhat-marketplace-dxrsn"
Dec 01 11:16:41 crc kubenswrapper[4909]: I1201 11:16:41.819221 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21f440d8-e9a5-48c4-b0e2-5eef96a10528-catalog-content\") pod \"redhat-marketplace-dxrsn\" (UID: \"21f440d8-e9a5-48c4-b0e2-5eef96a10528\") " pod="openshift-marketplace/redhat-marketplace-dxrsn"
Dec 01 11:16:41 crc kubenswrapper[4909]: I1201 11:16:41.819329 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf5d4\" (UniqueName: \"kubernetes.io/projected/21f440d8-e9a5-48c4-b0e2-5eef96a10528-kube-api-access-lf5d4\") pod \"redhat-marketplace-dxrsn\" (UID: \"21f440d8-e9a5-48c4-b0e2-5eef96a10528\") " pod="openshift-marketplace/redhat-marketplace-dxrsn"
Dec 01 11:16:41 crc kubenswrapper[4909]: I1201 11:16:41.921280 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf5d4\" (UniqueName: \"kubernetes.io/projected/21f440d8-e9a5-48c4-b0e2-5eef96a10528-kube-api-access-lf5d4\") pod \"redhat-marketplace-dxrsn\" (UID: \"21f440d8-e9a5-48c4-b0e2-5eef96a10528\") " pod="openshift-marketplace/redhat-marketplace-dxrsn"
Dec 01 11:16:41 crc kubenswrapper[4909]: I1201 11:16:41.921408 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21f440d8-e9a5-48c4-b0e2-5eef96a10528-utilities\") pod \"redhat-marketplace-dxrsn\" (UID: \"21f440d8-e9a5-48c4-b0e2-5eef96a10528\") " pod="openshift-marketplace/redhat-marketplace-dxrsn"
Dec 01 11:16:41 crc kubenswrapper[4909]: I1201 11:16:41.921552 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21f440d8-e9a5-48c4-b0e2-5eef96a10528-catalog-content\") pod \"redhat-marketplace-dxrsn\" (UID: \"21f440d8-e9a5-48c4-b0e2-5eef96a10528\") " pod="openshift-marketplace/redhat-marketplace-dxrsn"
Dec 01 11:16:41 crc kubenswrapper[4909]: I1201 11:16:41.922096 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21f440d8-e9a5-48c4-b0e2-5eef96a10528-catalog-content\") pod \"redhat-marketplace-dxrsn\" (UID: \"21f440d8-e9a5-48c4-b0e2-5eef96a10528\") " pod="openshift-marketplace/redhat-marketplace-dxrsn"
Dec 01 11:16:41 crc kubenswrapper[4909]: I1201 11:16:41.922353 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21f440d8-e9a5-48c4-b0e2-5eef96a10528-utilities\") pod \"redhat-marketplace-dxrsn\" (UID: \"21f440d8-e9a5-48c4-b0e2-5eef96a10528\") " pod="openshift-marketplace/redhat-marketplace-dxrsn"
Dec 01 11:16:41 crc kubenswrapper[4909]: I1201 11:16:41.942189 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf5d4\" (UniqueName: \"kubernetes.io/projected/21f440d8-e9a5-48c4-b0e2-5eef96a10528-kube-api-access-lf5d4\") pod \"redhat-marketplace-dxrsn\" (UID: \"21f440d8-e9a5-48c4-b0e2-5eef96a10528\") " pod="openshift-marketplace/redhat-marketplace-dxrsn"
Dec 01 11:16:42 crc kubenswrapper[4909]: I1201 11:16:42.014841 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dxrsn"
Dec 01 11:16:42 crc kubenswrapper[4909]: I1201 11:16:42.500669 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dxrsn"]
Dec 01 11:16:43 crc kubenswrapper[4909]: I1201 11:16:43.499236 4909 generic.go:334] "Generic (PLEG): container finished" podID="21f440d8-e9a5-48c4-b0e2-5eef96a10528" containerID="63f8fc71cf4f6979d584bb99fa716551dd62147355c0b9a8b9005a3314181753" exitCode=0
Dec 01 11:16:43 crc kubenswrapper[4909]: I1201 11:16:43.499331 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dxrsn" event={"ID":"21f440d8-e9a5-48c4-b0e2-5eef96a10528","Type":"ContainerDied","Data":"63f8fc71cf4f6979d584bb99fa716551dd62147355c0b9a8b9005a3314181753"}
Dec 01 11:16:43 crc kubenswrapper[4909]: I1201 11:16:43.499645 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dxrsn" event={"ID":"21f440d8-e9a5-48c4-b0e2-5eef96a10528","Type":"ContainerStarted","Data":"7f294868e403125cb1542d9b7b153b22cd3eaa54c16e51931b7293bc2066cf8e"}
Dec 01 11:16:43 crc kubenswrapper[4909]: I1201 11:16:43.501137 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 01 11:16:44 crc kubenswrapper[4909]: I1201 11:16:44.510788 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dxrsn" event={"ID":"21f440d8-e9a5-48c4-b0e2-5eef96a10528","Type":"ContainerStarted","Data":"fbcfe485a22d7dba6ea341d26873aa712221f034acd2efc4b1d2985a06433973"}
Dec 01 11:16:45 crc kubenswrapper[4909]: I1201 11:16:45.519902 4909 generic.go:334] "Generic (PLEG): container finished" podID="21f440d8-e9a5-48c4-b0e2-5eef96a10528" containerID="fbcfe485a22d7dba6ea341d26873aa712221f034acd2efc4b1d2985a06433973" exitCode=0
Dec 01 11:16:45 crc kubenswrapper[4909]: I1201 11:16:45.519945 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dxrsn" event={"ID":"21f440d8-e9a5-48c4-b0e2-5eef96a10528","Type":"ContainerDied","Data":"fbcfe485a22d7dba6ea341d26873aa712221f034acd2efc4b1d2985a06433973"}
Dec 01 11:16:46 crc kubenswrapper[4909]: I1201 11:16:46.534199 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dxrsn" event={"ID":"21f440d8-e9a5-48c4-b0e2-5eef96a10528","Type":"ContainerStarted","Data":"8058606795e6e4d6600f2e834258ab2c40e4cacf191f9128db5805e7bdf7b3a9"}
Dec 01 11:16:46 crc kubenswrapper[4909]: I1201 11:16:46.573401 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dxrsn" podStartSLOduration=3.036861257 podStartE2EDuration="5.573381488s" podCreationTimestamp="2025-12-01 11:16:41 +0000 UTC" firstStartedPulling="2025-12-01 11:16:43.500912979 +0000 UTC m=+2720.735383877" lastFinishedPulling="2025-12-01 11:16:46.03743321 +0000 UTC m=+2723.271904108" observedRunningTime="2025-12-01 11:16:46.560471581 +0000 UTC m=+2723.794942489" watchObservedRunningTime="2025-12-01 11:16:46.573381488 +0000 UTC m=+2723.807852386"
Dec 01 11:16:52 crc kubenswrapper[4909]: I1201 11:16:52.015315 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dxrsn"
Dec 01 11:16:52 crc kubenswrapper[4909]: I1201 11:16:52.015912 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dxrsn"
Dec 01 11:16:52 crc kubenswrapper[4909]: I1201 11:16:52.078728 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dxrsn"
Dec 01 11:16:52 crc kubenswrapper[4909]: I1201 11:16:52.639064 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dxrsn"
Dec 01 11:16:52 crc kubenswrapper[4909]: I1201 11:16:52.713288 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dxrsn"]
Dec 01 11:16:54 crc kubenswrapper[4909]: I1201 11:16:54.599443 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dxrsn" podUID="21f440d8-e9a5-48c4-b0e2-5eef96a10528" containerName="registry-server" containerID="cri-o://8058606795e6e4d6600f2e834258ab2c40e4cacf191f9128db5805e7bdf7b3a9" gracePeriod=2
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.050435 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dxrsn"
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.190019 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf5d4\" (UniqueName: \"kubernetes.io/projected/21f440d8-e9a5-48c4-b0e2-5eef96a10528-kube-api-access-lf5d4\") pod \"21f440d8-e9a5-48c4-b0e2-5eef96a10528\" (UID: \"21f440d8-e9a5-48c4-b0e2-5eef96a10528\") "
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.190148 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21f440d8-e9a5-48c4-b0e2-5eef96a10528-utilities\") pod \"21f440d8-e9a5-48c4-b0e2-5eef96a10528\" (UID: \"21f440d8-e9a5-48c4-b0e2-5eef96a10528\") "
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.190236 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21f440d8-e9a5-48c4-b0e2-5eef96a10528-catalog-content\") pod \"21f440d8-e9a5-48c4-b0e2-5eef96a10528\" (UID: \"21f440d8-e9a5-48c4-b0e2-5eef96a10528\") "
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.191529 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21f440d8-e9a5-48c4-b0e2-5eef96a10528-utilities" (OuterVolumeSpecName: "utilities") pod "21f440d8-e9a5-48c4-b0e2-5eef96a10528" (UID: "21f440d8-e9a5-48c4-b0e2-5eef96a10528"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.196861 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21f440d8-e9a5-48c4-b0e2-5eef96a10528-kube-api-access-lf5d4" (OuterVolumeSpecName: "kube-api-access-lf5d4") pod "21f440d8-e9a5-48c4-b0e2-5eef96a10528" (UID: "21f440d8-e9a5-48c4-b0e2-5eef96a10528"). InnerVolumeSpecName "kube-api-access-lf5d4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.209042 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21f440d8-e9a5-48c4-b0e2-5eef96a10528-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21f440d8-e9a5-48c4-b0e2-5eef96a10528" (UID: "21f440d8-e9a5-48c4-b0e2-5eef96a10528"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.292618 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lf5d4\" (UniqueName: \"kubernetes.io/projected/21f440d8-e9a5-48c4-b0e2-5eef96a10528-kube-api-access-lf5d4\") on node \"crc\" DevicePath \"\""
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.292815 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21f440d8-e9a5-48c4-b0e2-5eef96a10528-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.292892 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21f440d8-e9a5-48c4-b0e2-5eef96a10528-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.607777 4909 generic.go:334] "Generic (PLEG): container finished" podID="447d20e1-014c-49bb-a3c9-9057b255a1ed" containerID="3b57d8c678e5c1d82a017e916294731f1f42a9a39d0290b785e04905acadf507" exitCode=2
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.607937 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" event={"ID":"447d20e1-014c-49bb-a3c9-9057b255a1ed","Type":"ContainerDied","Data":"3b57d8c678e5c1d82a017e916294731f1f42a9a39d0290b785e04905acadf507"}
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.610769 4909 generic.go:334] "Generic (PLEG): container finished" podID="21f440d8-e9a5-48c4-b0e2-5eef96a10528" containerID="8058606795e6e4d6600f2e834258ab2c40e4cacf191f9128db5805e7bdf7b3a9" exitCode=0
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.610798 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dxrsn" event={"ID":"21f440d8-e9a5-48c4-b0e2-5eef96a10528","Type":"ContainerDied","Data":"8058606795e6e4d6600f2e834258ab2c40e4cacf191f9128db5805e7bdf7b3a9"}
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.610815 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dxrsn"
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.610825 4909 scope.go:117] "RemoveContainer" containerID="8058606795e6e4d6600f2e834258ab2c40e4cacf191f9128db5805e7bdf7b3a9"
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.610816 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dxrsn" event={"ID":"21f440d8-e9a5-48c4-b0e2-5eef96a10528","Type":"ContainerDied","Data":"7f294868e403125cb1542d9b7b153b22cd3eaa54c16e51931b7293bc2066cf8e"}
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.639364 4909 scope.go:117] "RemoveContainer" containerID="fbcfe485a22d7dba6ea341d26873aa712221f034acd2efc4b1d2985a06433973"
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.652970 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dxrsn"]
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.658654 4909 scope.go:117] "RemoveContainer" containerID="63f8fc71cf4f6979d584bb99fa716551dd62147355c0b9a8b9005a3314181753"
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.660217 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dxrsn"]
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.702175 4909 scope.go:117] "RemoveContainer" containerID="8058606795e6e4d6600f2e834258ab2c40e4cacf191f9128db5805e7bdf7b3a9"
Dec 01 11:16:55 crc kubenswrapper[4909]: E1201 11:16:55.702637 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8058606795e6e4d6600f2e834258ab2c40e4cacf191f9128db5805e7bdf7b3a9\": container with ID starting with 8058606795e6e4d6600f2e834258ab2c40e4cacf191f9128db5805e7bdf7b3a9 not found: ID does not exist" containerID="8058606795e6e4d6600f2e834258ab2c40e4cacf191f9128db5805e7bdf7b3a9"
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.702755 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8058606795e6e4d6600f2e834258ab2c40e4cacf191f9128db5805e7bdf7b3a9"} err="failed to get container status \"8058606795e6e4d6600f2e834258ab2c40e4cacf191f9128db5805e7bdf7b3a9\": rpc error: code = NotFound desc = could not find container \"8058606795e6e4d6600f2e834258ab2c40e4cacf191f9128db5805e7bdf7b3a9\": container with ID starting with 8058606795e6e4d6600f2e834258ab2c40e4cacf191f9128db5805e7bdf7b3a9 not found: ID does not exist"
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.702836 4909 scope.go:117] "RemoveContainer" containerID="fbcfe485a22d7dba6ea341d26873aa712221f034acd2efc4b1d2985a06433973"
Dec 01 11:16:55 crc kubenswrapper[4909]: E1201 11:16:55.703254 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbcfe485a22d7dba6ea341d26873aa712221f034acd2efc4b1d2985a06433973\": container with ID starting with fbcfe485a22d7dba6ea341d26873aa712221f034acd2efc4b1d2985a06433973 not found: ID does not exist" containerID="fbcfe485a22d7dba6ea341d26873aa712221f034acd2efc4b1d2985a06433973"
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.703328 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbcfe485a22d7dba6ea341d26873aa712221f034acd2efc4b1d2985a06433973"} err="failed to get container status \"fbcfe485a22d7dba6ea341d26873aa712221f034acd2efc4b1d2985a06433973\": rpc error: code = NotFound desc = could not find container \"fbcfe485a22d7dba6ea341d26873aa712221f034acd2efc4b1d2985a06433973\": container with ID starting with fbcfe485a22d7dba6ea341d26873aa712221f034acd2efc4b1d2985a06433973 not found: ID does not exist"
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.703388 4909 scope.go:117] "RemoveContainer" containerID="63f8fc71cf4f6979d584bb99fa716551dd62147355c0b9a8b9005a3314181753"
Dec 01 11:16:55 crc kubenswrapper[4909]: E1201 11:16:55.703747 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63f8fc71cf4f6979d584bb99fa716551dd62147355c0b9a8b9005a3314181753\": container with ID starting with 63f8fc71cf4f6979d584bb99fa716551dd62147355c0b9a8b9005a3314181753 not found: ID does not exist" containerID="63f8fc71cf4f6979d584bb99fa716551dd62147355c0b9a8b9005a3314181753"
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.703823 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63f8fc71cf4f6979d584bb99fa716551dd62147355c0b9a8b9005a3314181753"} err="failed to get container status \"63f8fc71cf4f6979d584bb99fa716551dd62147355c0b9a8b9005a3314181753\": rpc error: code = NotFound desc = could not find container \"63f8fc71cf4f6979d584bb99fa716551dd62147355c0b9a8b9005a3314181753\": container with ID starting with 63f8fc71cf4f6979d584bb99fa716551dd62147355c0b9a8b9005a3314181753 not found: ID does not exist"
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.726430 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fgfbc"]
Dec 01 11:16:55 crc kubenswrapper[4909]: E1201 11:16:55.727068 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f440d8-e9a5-48c4-b0e2-5eef96a10528" containerName="registry-server"
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.727092 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f440d8-e9a5-48c4-b0e2-5eef96a10528" containerName="registry-server"
Dec 01 11:16:55 crc kubenswrapper[4909]: E1201 11:16:55.727110 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f440d8-e9a5-48c4-b0e2-5eef96a10528" containerName="extract-content"
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.727118 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f440d8-e9a5-48c4-b0e2-5eef96a10528" containerName="extract-content"
Dec 01 11:16:55 crc kubenswrapper[4909]: E1201 11:16:55.727147 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f440d8-e9a5-48c4-b0e2-5eef96a10528" containerName="extract-utilities"
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.727155 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f440d8-e9a5-48c4-b0e2-5eef96a10528" containerName="extract-utilities"
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.727385 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f440d8-e9a5-48c4-b0e2-5eef96a10528" containerName="registry-server"
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.733020 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fgfbc"
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.739724 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fgfbc"]
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.905161 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hs7l\" (UniqueName: \"kubernetes.io/projected/f23d3fbc-ccd5-436c-830a-b38b8eb88e6e-kube-api-access-9hs7l\") pod \"certified-operators-fgfbc\" (UID: \"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e\") " pod="openshift-marketplace/certified-operators-fgfbc"
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.905206 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f23d3fbc-ccd5-436c-830a-b38b8eb88e6e-utilities\") pod \"certified-operators-fgfbc\" (UID: \"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e\") " pod="openshift-marketplace/certified-operators-fgfbc"
Dec 01 11:16:55 crc kubenswrapper[4909]: I1201 11:16:55.906253 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f23d3fbc-ccd5-436c-830a-b38b8eb88e6e-catalog-content\") pod \"certified-operators-fgfbc\" (UID: \"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e\") " pod="openshift-marketplace/certified-operators-fgfbc"
Dec 01 11:16:56 crc kubenswrapper[4909]: I1201 11:16:56.007923 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hs7l\" (UniqueName: \"kubernetes.io/projected/f23d3fbc-ccd5-436c-830a-b38b8eb88e6e-kube-api-access-9hs7l\") pod \"certified-operators-fgfbc\" (UID: \"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e\") " pod="openshift-marketplace/certified-operators-fgfbc"
Dec 01 11:16:56 crc kubenswrapper[4909]: I1201 11:16:56.007969 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f23d3fbc-ccd5-436c-830a-b38b8eb88e6e-utilities\") pod \"certified-operators-fgfbc\" (UID: \"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e\") " pod="openshift-marketplace/certified-operators-fgfbc"
Dec 01 11:16:56 crc kubenswrapper[4909]: I1201 11:16:56.008075 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f23d3fbc-ccd5-436c-830a-b38b8eb88e6e-catalog-content\") pod \"certified-operators-fgfbc\" (UID: \"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e\") " pod="openshift-marketplace/certified-operators-fgfbc"
Dec 01 11:16:56 crc kubenswrapper[4909]: I1201 11:16:56.008622 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f23d3fbc-ccd5-436c-830a-b38b8eb88e6e-utilities\") pod \"certified-operators-fgfbc\" (UID: \"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e\") " pod="openshift-marketplace/certified-operators-fgfbc"
Dec 01 11:16:56 crc kubenswrapper[4909]: I1201 11:16:56.008678 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f23d3fbc-ccd5-436c-830a-b38b8eb88e6e-catalog-content\") pod \"certified-operators-fgfbc\" (UID: \"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e\") " pod="openshift-marketplace/certified-operators-fgfbc"
Dec 01 11:16:56 crc kubenswrapper[4909]: I1201 11:16:56.028003 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hs7l\" (UniqueName: \"kubernetes.io/projected/f23d3fbc-ccd5-436c-830a-b38b8eb88e6e-kube-api-access-9hs7l\") pod \"certified-operators-fgfbc\" (UID: \"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e\") " pod="openshift-marketplace/certified-operators-fgfbc"
Dec 01 11:16:56 crc kubenswrapper[4909]: I1201 11:16:56.096098 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fgfbc"
Dec 01 11:16:56 crc kubenswrapper[4909]: I1201 11:16:56.654800 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fgfbc"]
Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.085424 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x"
Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.236815 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-ceph\") pod \"447d20e1-014c-49bb-a3c9-9057b255a1ed\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") "
Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.236917 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-ssh-key\") pod \"447d20e1-014c-49bb-a3c9-9057b255a1ed\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") "
Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.237074 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf625\" (UniqueName: \"kubernetes.io/projected/447d20e1-014c-49bb-a3c9-9057b255a1ed-kube-api-access-tf625\") pod \"447d20e1-014c-49bb-a3c9-9057b255a1ed\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") "
Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.237133 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-libvirt-secret-0\") pod \"447d20e1-014c-49bb-a3c9-9057b255a1ed\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") "
Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.237216 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-libvirt-combined-ca-bundle\") pod \"447d20e1-014c-49bb-a3c9-9057b255a1ed\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") "
Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.237371 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-inventory\") pod \"447d20e1-014c-49bb-a3c9-9057b255a1ed\" (UID: \"447d20e1-014c-49bb-a3c9-9057b255a1ed\") "
Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.244130 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "447d20e1-014c-49bb-a3c9-9057b255a1ed" (UID: "447d20e1-014c-49bb-a3c9-9057b255a1ed"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.244182 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/447d20e1-014c-49bb-a3c9-9057b255a1ed-kube-api-access-tf625" (OuterVolumeSpecName: "kube-api-access-tf625") pod "447d20e1-014c-49bb-a3c9-9057b255a1ed" (UID: "447d20e1-014c-49bb-a3c9-9057b255a1ed"). InnerVolumeSpecName "kube-api-access-tf625". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.247172 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-ceph" (OuterVolumeSpecName: "ceph") pod "447d20e1-014c-49bb-a3c9-9057b255a1ed" (UID: "447d20e1-014c-49bb-a3c9-9057b255a1ed"). InnerVolumeSpecName "ceph".
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.268505 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21f440d8-e9a5-48c4-b0e2-5eef96a10528" path="/var/lib/kubelet/pods/21f440d8-e9a5-48c4-b0e2-5eef96a10528/volumes" Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.271403 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-inventory" (OuterVolumeSpecName: "inventory") pod "447d20e1-014c-49bb-a3c9-9057b255a1ed" (UID: "447d20e1-014c-49bb-a3c9-9057b255a1ed"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.273305 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "447d20e1-014c-49bb-a3c9-9057b255a1ed" (UID: "447d20e1-014c-49bb-a3c9-9057b255a1ed"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.275750 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "447d20e1-014c-49bb-a3c9-9057b255a1ed" (UID: "447d20e1-014c-49bb-a3c9-9057b255a1ed"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.340029 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.340068 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.340087 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tf625\" (UniqueName: \"kubernetes.io/projected/447d20e1-014c-49bb-a3c9-9057b255a1ed-kube-api-access-tf625\") on node \"crc\" DevicePath \"\"" Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.340099 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.340215 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.340233 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/447d20e1-014c-49bb-a3c9-9057b255a1ed-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.632808 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" event={"ID":"447d20e1-014c-49bb-a3c9-9057b255a1ed","Type":"ContainerDied","Data":"439448f8d5e29afaf203ddfa5608a1594a60f848ab9ab0a7ce90d2a2869a4271"} Dec 01 11:16:57 crc 
kubenswrapper[4909]: I1201 11:16:57.632835 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x" Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.632845 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="439448f8d5e29afaf203ddfa5608a1594a60f848ab9ab0a7ce90d2a2869a4271" Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.637805 4909 generic.go:334] "Generic (PLEG): container finished" podID="f23d3fbc-ccd5-436c-830a-b38b8eb88e6e" containerID="cf127cc4539a3ddb0e71990cbb6b174f36bb60dee5c164ebd368aa8206f86927" exitCode=0 Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.637847 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgfbc" event={"ID":"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e","Type":"ContainerDied","Data":"cf127cc4539a3ddb0e71990cbb6b174f36bb60dee5c164ebd368aa8206f86927"} Dec 01 11:16:57 crc kubenswrapper[4909]: I1201 11:16:57.637892 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgfbc" event={"ID":"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e","Type":"ContainerStarted","Data":"aff05bca52886d9a67bfd6d5dfb88ff74dfb4cf821e3604570865dd58210e2a7"} Dec 01 11:16:58 crc kubenswrapper[4909]: I1201 11:16:58.647771 4909 generic.go:334] "Generic (PLEG): container finished" podID="f23d3fbc-ccd5-436c-830a-b38b8eb88e6e" containerID="762a8b704c82f3a050564308bd42459fea0b21038074924409426d5dbc07630c" exitCode=0 Dec 01 11:16:58 crc kubenswrapper[4909]: I1201 11:16:58.648155 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgfbc" event={"ID":"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e","Type":"ContainerDied","Data":"762a8b704c82f3a050564308bd42459fea0b21038074924409426d5dbc07630c"} Dec 01 11:16:59 crc kubenswrapper[4909]: I1201 11:16:59.657522 4909 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-fgfbc" event={"ID":"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e","Type":"ContainerStarted","Data":"4b7e08a164349648f441fa35c2c0fc8068d7fb22e2ac772fe0b01b9370b82b1a"} Dec 01 11:16:59 crc kubenswrapper[4909]: I1201 11:16:59.675601 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fgfbc" podStartSLOduration=3.106499879 podStartE2EDuration="4.675585252s" podCreationTimestamp="2025-12-01 11:16:55 +0000 UTC" firstStartedPulling="2025-12-01 11:16:57.639451606 +0000 UTC m=+2734.873922494" lastFinishedPulling="2025-12-01 11:16:59.208536969 +0000 UTC m=+2736.443007867" observedRunningTime="2025-12-01 11:16:59.673181728 +0000 UTC m=+2736.907652646" watchObservedRunningTime="2025-12-01 11:16:59.675585252 +0000 UTC m=+2736.910056150" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.025251 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b"] Dec 01 11:17:04 crc kubenswrapper[4909]: E1201 11:17:04.026341 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="447d20e1-014c-49bb-a3c9-9057b255a1ed" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.026360 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="447d20e1-014c-49bb-a3c9-9057b255a1ed" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.026589 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="447d20e1-014c-49bb-a3c9-9057b255a1ed" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.027382 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.029043 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.029410 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.035558 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.035569 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.035767 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.036053 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.038514 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b"] Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.066493 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slsqj\" (UniqueName: \"kubernetes.io/projected/97aa464f-8d31-450b-a16a-4c6538c27bbb-kube-api-access-slsqj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.066559 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.066607 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.066668 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.066730 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.066908 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.168528 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.168596 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.168661 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slsqj\" (UniqueName: \"kubernetes.io/projected/97aa464f-8d31-450b-a16a-4c6538c27bbb-kube-api-access-slsqj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.168691 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.168726 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.168758 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.174790 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.174961 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.175170 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.175225 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.176118 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.188668 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slsqj\" (UniqueName: \"kubernetes.io/projected/97aa464f-8d31-450b-a16a-4c6538c27bbb-kube-api-access-slsqj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.350010 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:17:04 crc kubenswrapper[4909]: I1201 11:17:04.866393 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b"] Dec 01 11:17:05 crc kubenswrapper[4909]: I1201 11:17:05.705548 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" event={"ID":"97aa464f-8d31-450b-a16a-4c6538c27bbb","Type":"ContainerStarted","Data":"c270d1fe3749d0047c93f1a6ef30ba3c4c48fe137f3bba4b71b597e913f37a9c"} Dec 01 11:17:05 crc kubenswrapper[4909]: I1201 11:17:05.705849 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" event={"ID":"97aa464f-8d31-450b-a16a-4c6538c27bbb","Type":"ContainerStarted","Data":"b972c3988b43e107c62a9c7d584ab7641f180840cf9046b7e54df62ffcfb7992"} Dec 01 11:17:05 crc kubenswrapper[4909]: I1201 11:17:05.720634 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" podStartSLOduration=1.250262059 podStartE2EDuration="1.720620263s" podCreationTimestamp="2025-12-01 11:17:04 +0000 UTC" firstStartedPulling="2025-12-01 11:17:04.870154494 +0000 UTC m=+2742.104625402" lastFinishedPulling="2025-12-01 11:17:05.340512698 +0000 UTC m=+2742.574983606" observedRunningTime="2025-12-01 11:17:05.718834498 +0000 UTC m=+2742.953305396" watchObservedRunningTime="2025-12-01 11:17:05.720620263 +0000 UTC m=+2742.955091161" Dec 01 11:17:06 crc kubenswrapper[4909]: I1201 11:17:06.096427 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fgfbc" Dec 01 11:17:06 crc kubenswrapper[4909]: I1201 11:17:06.096486 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fgfbc" Dec 01 11:17:06 crc 
kubenswrapper[4909]: I1201 11:17:06.165789 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fgfbc" Dec 01 11:17:06 crc kubenswrapper[4909]: I1201 11:17:06.760155 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fgfbc" Dec 01 11:17:06 crc kubenswrapper[4909]: I1201 11:17:06.808626 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fgfbc"] Dec 01 11:17:08 crc kubenswrapper[4909]: I1201 11:17:08.730480 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fgfbc" podUID="f23d3fbc-ccd5-436c-830a-b38b8eb88e6e" containerName="registry-server" containerID="cri-o://4b7e08a164349648f441fa35c2c0fc8068d7fb22e2ac772fe0b01b9370b82b1a" gracePeriod=2 Dec 01 11:17:08 crc kubenswrapper[4909]: E1201 11:17:08.889283 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf23d3fbc_ccd5_436c_830a_b38b8eb88e6e.slice/crio-4b7e08a164349648f441fa35c2c0fc8068d7fb22e2ac772fe0b01b9370b82b1a.scope\": RecentStats: unable to find data in memory cache]" Dec 01 11:17:09 crc kubenswrapper[4909]: I1201 11:17:09.739330 4909 generic.go:334] "Generic (PLEG): container finished" podID="f23d3fbc-ccd5-436c-830a-b38b8eb88e6e" containerID="4b7e08a164349648f441fa35c2c0fc8068d7fb22e2ac772fe0b01b9370b82b1a" exitCode=0 Dec 01 11:17:09 crc kubenswrapper[4909]: I1201 11:17:09.739418 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgfbc" event={"ID":"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e","Type":"ContainerDied","Data":"4b7e08a164349648f441fa35c2c0fc8068d7fb22e2ac772fe0b01b9370b82b1a"} Dec 01 11:17:09 crc kubenswrapper[4909]: I1201 11:17:09.739665 4909 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-fgfbc" event={"ID":"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e","Type":"ContainerDied","Data":"aff05bca52886d9a67bfd6d5dfb88ff74dfb4cf821e3604570865dd58210e2a7"} Dec 01 11:17:09 crc kubenswrapper[4909]: I1201 11:17:09.739681 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aff05bca52886d9a67bfd6d5dfb88ff74dfb4cf821e3604570865dd58210e2a7" Dec 01 11:17:09 crc kubenswrapper[4909]: I1201 11:17:09.755082 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fgfbc" Dec 01 11:17:09 crc kubenswrapper[4909]: I1201 11:17:09.943724 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f23d3fbc-ccd5-436c-830a-b38b8eb88e6e-utilities\") pod \"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e\" (UID: \"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e\") " Dec 01 11:17:09 crc kubenswrapper[4909]: I1201 11:17:09.944211 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hs7l\" (UniqueName: \"kubernetes.io/projected/f23d3fbc-ccd5-436c-830a-b38b8eb88e6e-kube-api-access-9hs7l\") pod \"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e\" (UID: \"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e\") " Dec 01 11:17:09 crc kubenswrapper[4909]: I1201 11:17:09.944343 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f23d3fbc-ccd5-436c-830a-b38b8eb88e6e-catalog-content\") pod \"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e\" (UID: \"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e\") " Dec 01 11:17:09 crc kubenswrapper[4909]: I1201 11:17:09.952571 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f23d3fbc-ccd5-436c-830a-b38b8eb88e6e-kube-api-access-9hs7l" (OuterVolumeSpecName: "kube-api-access-9hs7l") pod 
"f23d3fbc-ccd5-436c-830a-b38b8eb88e6e" (UID: "f23d3fbc-ccd5-436c-830a-b38b8eb88e6e"). InnerVolumeSpecName "kube-api-access-9hs7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:17:09 crc kubenswrapper[4909]: I1201 11:17:09.964931 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f23d3fbc-ccd5-436c-830a-b38b8eb88e6e-utilities" (OuterVolumeSpecName: "utilities") pod "f23d3fbc-ccd5-436c-830a-b38b8eb88e6e" (UID: "f23d3fbc-ccd5-436c-830a-b38b8eb88e6e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:17:10 crc kubenswrapper[4909]: I1201 11:17:10.001856 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f23d3fbc-ccd5-436c-830a-b38b8eb88e6e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f23d3fbc-ccd5-436c-830a-b38b8eb88e6e" (UID: "f23d3fbc-ccd5-436c-830a-b38b8eb88e6e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:17:10 crc kubenswrapper[4909]: I1201 11:17:10.054104 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f23d3fbc-ccd5-436c-830a-b38b8eb88e6e-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:17:10 crc kubenswrapper[4909]: I1201 11:17:10.054159 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hs7l\" (UniqueName: \"kubernetes.io/projected/f23d3fbc-ccd5-436c-830a-b38b8eb88e6e-kube-api-access-9hs7l\") on node \"crc\" DevicePath \"\"" Dec 01 11:17:10 crc kubenswrapper[4909]: I1201 11:17:10.054174 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f23d3fbc-ccd5-436c-830a-b38b8eb88e6e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:17:10 crc kubenswrapper[4909]: I1201 11:17:10.748603 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fgfbc" Dec 01 11:17:10 crc kubenswrapper[4909]: I1201 11:17:10.785348 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fgfbc"] Dec 01 11:17:10 crc kubenswrapper[4909]: I1201 11:17:10.795388 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fgfbc"] Dec 01 11:17:11 crc kubenswrapper[4909]: I1201 11:17:11.267100 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f23d3fbc-ccd5-436c-830a-b38b8eb88e6e" path="/var/lib/kubelet/pods/f23d3fbc-ccd5-436c-830a-b38b8eb88e6e/volumes" Dec 01 11:18:25 crc kubenswrapper[4909]: I1201 11:18:25.952977 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n7rk7"] Dec 01 11:18:25 crc kubenswrapper[4909]: E1201 11:18:25.953846 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f23d3fbc-ccd5-436c-830a-b38b8eb88e6e" containerName="extract-utilities" Dec 01 11:18:25 crc kubenswrapper[4909]: I1201 11:18:25.953859 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f23d3fbc-ccd5-436c-830a-b38b8eb88e6e" containerName="extract-utilities" Dec 01 11:18:25 crc kubenswrapper[4909]: E1201 11:18:25.953894 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f23d3fbc-ccd5-436c-830a-b38b8eb88e6e" containerName="registry-server" Dec 01 11:18:25 crc kubenswrapper[4909]: I1201 11:18:25.953901 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f23d3fbc-ccd5-436c-830a-b38b8eb88e6e" containerName="registry-server" Dec 01 11:18:25 crc kubenswrapper[4909]: E1201 11:18:25.953910 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f23d3fbc-ccd5-436c-830a-b38b8eb88e6e" containerName="extract-content" Dec 01 11:18:25 crc kubenswrapper[4909]: I1201 11:18:25.953916 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f23d3fbc-ccd5-436c-830a-b38b8eb88e6e" containerName="extract-content" Dec 01 11:18:25 crc kubenswrapper[4909]: I1201 11:18:25.954129 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f23d3fbc-ccd5-436c-830a-b38b8eb88e6e" containerName="registry-server" Dec 01 11:18:25 crc kubenswrapper[4909]: I1201 11:18:25.955528 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n7rk7" Dec 01 11:18:25 crc kubenswrapper[4909]: I1201 11:18:25.969632 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n7rk7"] Dec 01 11:18:26 crc kubenswrapper[4909]: I1201 11:18:26.113280 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3f2d197-6d1f-4798-82ae-a019d39c056f-catalog-content\") pod \"redhat-operators-n7rk7\" (UID: \"a3f2d197-6d1f-4798-82ae-a019d39c056f\") " pod="openshift-marketplace/redhat-operators-n7rk7" Dec 01 11:18:26 crc kubenswrapper[4909]: I1201 11:18:26.113547 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3f2d197-6d1f-4798-82ae-a019d39c056f-utilities\") pod \"redhat-operators-n7rk7\" (UID: \"a3f2d197-6d1f-4798-82ae-a019d39c056f\") " pod="openshift-marketplace/redhat-operators-n7rk7" Dec 01 11:18:26 crc kubenswrapper[4909]: I1201 11:18:26.113711 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxw87\" (UniqueName: \"kubernetes.io/projected/a3f2d197-6d1f-4798-82ae-a019d39c056f-kube-api-access-lxw87\") pod \"redhat-operators-n7rk7\" (UID: \"a3f2d197-6d1f-4798-82ae-a019d39c056f\") " pod="openshift-marketplace/redhat-operators-n7rk7" Dec 01 11:18:26 crc kubenswrapper[4909]: I1201 11:18:26.215742 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-lxw87\" (UniqueName: \"kubernetes.io/projected/a3f2d197-6d1f-4798-82ae-a019d39c056f-kube-api-access-lxw87\") pod \"redhat-operators-n7rk7\" (UID: \"a3f2d197-6d1f-4798-82ae-a019d39c056f\") " pod="openshift-marketplace/redhat-operators-n7rk7" Dec 01 11:18:26 crc kubenswrapper[4909]: I1201 11:18:26.215901 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3f2d197-6d1f-4798-82ae-a019d39c056f-catalog-content\") pod \"redhat-operators-n7rk7\" (UID: \"a3f2d197-6d1f-4798-82ae-a019d39c056f\") " pod="openshift-marketplace/redhat-operators-n7rk7" Dec 01 11:18:26 crc kubenswrapper[4909]: I1201 11:18:26.215927 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3f2d197-6d1f-4798-82ae-a019d39c056f-utilities\") pod \"redhat-operators-n7rk7\" (UID: \"a3f2d197-6d1f-4798-82ae-a019d39c056f\") " pod="openshift-marketplace/redhat-operators-n7rk7" Dec 01 11:18:26 crc kubenswrapper[4909]: I1201 11:18:26.216570 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3f2d197-6d1f-4798-82ae-a019d39c056f-utilities\") pod \"redhat-operators-n7rk7\" (UID: \"a3f2d197-6d1f-4798-82ae-a019d39c056f\") " pod="openshift-marketplace/redhat-operators-n7rk7" Dec 01 11:18:26 crc kubenswrapper[4909]: I1201 11:18:26.216956 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3f2d197-6d1f-4798-82ae-a019d39c056f-catalog-content\") pod \"redhat-operators-n7rk7\" (UID: \"a3f2d197-6d1f-4798-82ae-a019d39c056f\") " pod="openshift-marketplace/redhat-operators-n7rk7" Dec 01 11:18:26 crc kubenswrapper[4909]: I1201 11:18:26.237976 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxw87\" (UniqueName: 
\"kubernetes.io/projected/a3f2d197-6d1f-4798-82ae-a019d39c056f-kube-api-access-lxw87\") pod \"redhat-operators-n7rk7\" (UID: \"a3f2d197-6d1f-4798-82ae-a019d39c056f\") " pod="openshift-marketplace/redhat-operators-n7rk7" Dec 01 11:18:26 crc kubenswrapper[4909]: I1201 11:18:26.294529 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n7rk7" Dec 01 11:18:26 crc kubenswrapper[4909]: I1201 11:18:26.789648 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n7rk7"] Dec 01 11:18:27 crc kubenswrapper[4909]: I1201 11:18:27.444966 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n7rk7" event={"ID":"a3f2d197-6d1f-4798-82ae-a019d39c056f","Type":"ContainerDied","Data":"f3092ab1b88968392f1ca72eea8bf7964e8aaff8078299fc051450e04ee4468e"} Dec 01 11:18:27 crc kubenswrapper[4909]: I1201 11:18:27.444773 4909 generic.go:334] "Generic (PLEG): container finished" podID="a3f2d197-6d1f-4798-82ae-a019d39c056f" containerID="f3092ab1b88968392f1ca72eea8bf7964e8aaff8078299fc051450e04ee4468e" exitCode=0 Dec 01 11:18:27 crc kubenswrapper[4909]: I1201 11:18:27.446648 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n7rk7" event={"ID":"a3f2d197-6d1f-4798-82ae-a019d39c056f","Type":"ContainerStarted","Data":"f759d3d527108ad62944c7794001f25cdd3d9eb01c8d16cbfa62558bb4432522"} Dec 01 11:18:28 crc kubenswrapper[4909]: I1201 11:18:28.457780 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n7rk7" event={"ID":"a3f2d197-6d1f-4798-82ae-a019d39c056f","Type":"ContainerStarted","Data":"9764816cb8894c392e273c9370f7ae71ce932dcaa53d588a5200cc7f88427e0b"} Dec 01 11:18:29 crc kubenswrapper[4909]: I1201 11:18:29.470049 4909 generic.go:334] "Generic (PLEG): container finished" podID="a3f2d197-6d1f-4798-82ae-a019d39c056f" 
containerID="9764816cb8894c392e273c9370f7ae71ce932dcaa53d588a5200cc7f88427e0b" exitCode=0 Dec 01 11:18:29 crc kubenswrapper[4909]: I1201 11:18:29.470144 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n7rk7" event={"ID":"a3f2d197-6d1f-4798-82ae-a019d39c056f","Type":"ContainerDied","Data":"9764816cb8894c392e273c9370f7ae71ce932dcaa53d588a5200cc7f88427e0b"} Dec 01 11:18:31 crc kubenswrapper[4909]: I1201 11:18:31.490115 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n7rk7" event={"ID":"a3f2d197-6d1f-4798-82ae-a019d39c056f","Type":"ContainerStarted","Data":"b59bfb7cdeb77c5e9105b11b064b42d20b0fdc021f39458733e64a3dcd7446a7"} Dec 01 11:18:31 crc kubenswrapper[4909]: I1201 11:18:31.516099 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n7rk7" podStartSLOduration=2.656925004 podStartE2EDuration="6.516020288s" podCreationTimestamp="2025-12-01 11:18:25 +0000 UTC" firstStartedPulling="2025-12-01 11:18:27.447073467 +0000 UTC m=+2824.681544365" lastFinishedPulling="2025-12-01 11:18:31.306168751 +0000 UTC m=+2828.540639649" observedRunningTime="2025-12-01 11:18:31.506770495 +0000 UTC m=+2828.741241413" watchObservedRunningTime="2025-12-01 11:18:31.516020288 +0000 UTC m=+2828.750491186" Dec 01 11:18:36 crc kubenswrapper[4909]: I1201 11:18:36.193356 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:18:36 crc kubenswrapper[4909]: I1201 11:18:36.194336 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:18:36 crc kubenswrapper[4909]: I1201 11:18:36.295586 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n7rk7" Dec 01 11:18:36 crc kubenswrapper[4909]: I1201 11:18:36.295658 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n7rk7" Dec 01 11:18:36 crc kubenswrapper[4909]: I1201 11:18:36.355502 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n7rk7" Dec 01 11:18:36 crc kubenswrapper[4909]: I1201 11:18:36.616253 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n7rk7" Dec 01 11:18:36 crc kubenswrapper[4909]: I1201 11:18:36.666070 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n7rk7"] Dec 01 11:18:38 crc kubenswrapper[4909]: I1201 11:18:38.542309 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n7rk7" podUID="a3f2d197-6d1f-4798-82ae-a019d39c056f" containerName="registry-server" containerID="cri-o://b59bfb7cdeb77c5e9105b11b064b42d20b0fdc021f39458733e64a3dcd7446a7" gracePeriod=2 Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.031266 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n7rk7" Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.172786 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxw87\" (UniqueName: \"kubernetes.io/projected/a3f2d197-6d1f-4798-82ae-a019d39c056f-kube-api-access-lxw87\") pod \"a3f2d197-6d1f-4798-82ae-a019d39c056f\" (UID: \"a3f2d197-6d1f-4798-82ae-a019d39c056f\") " Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.173316 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3f2d197-6d1f-4798-82ae-a019d39c056f-utilities\") pod \"a3f2d197-6d1f-4798-82ae-a019d39c056f\" (UID: \"a3f2d197-6d1f-4798-82ae-a019d39c056f\") " Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.173410 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3f2d197-6d1f-4798-82ae-a019d39c056f-catalog-content\") pod \"a3f2d197-6d1f-4798-82ae-a019d39c056f\" (UID: \"a3f2d197-6d1f-4798-82ae-a019d39c056f\") " Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.174011 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3f2d197-6d1f-4798-82ae-a019d39c056f-utilities" (OuterVolumeSpecName: "utilities") pod "a3f2d197-6d1f-4798-82ae-a019d39c056f" (UID: "a3f2d197-6d1f-4798-82ae-a019d39c056f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.179140 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3f2d197-6d1f-4798-82ae-a019d39c056f-kube-api-access-lxw87" (OuterVolumeSpecName: "kube-api-access-lxw87") pod "a3f2d197-6d1f-4798-82ae-a019d39c056f" (UID: "a3f2d197-6d1f-4798-82ae-a019d39c056f"). InnerVolumeSpecName "kube-api-access-lxw87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.275628 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3f2d197-6d1f-4798-82ae-a019d39c056f-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.275655 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxw87\" (UniqueName: \"kubernetes.io/projected/a3f2d197-6d1f-4798-82ae-a019d39c056f-kube-api-access-lxw87\") on node \"crc\" DevicePath \"\"" Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.277296 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3f2d197-6d1f-4798-82ae-a019d39c056f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3f2d197-6d1f-4798-82ae-a019d39c056f" (UID: "a3f2d197-6d1f-4798-82ae-a019d39c056f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.377606 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3f2d197-6d1f-4798-82ae-a019d39c056f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.552743 4909 generic.go:334] "Generic (PLEG): container finished" podID="a3f2d197-6d1f-4798-82ae-a019d39c056f" containerID="b59bfb7cdeb77c5e9105b11b064b42d20b0fdc021f39458733e64a3dcd7446a7" exitCode=0 Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.552794 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n7rk7" event={"ID":"a3f2d197-6d1f-4798-82ae-a019d39c056f","Type":"ContainerDied","Data":"b59bfb7cdeb77c5e9105b11b064b42d20b0fdc021f39458733e64a3dcd7446a7"} Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.552846 4909 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-n7rk7" event={"ID":"a3f2d197-6d1f-4798-82ae-a019d39c056f","Type":"ContainerDied","Data":"f759d3d527108ad62944c7794001f25cdd3d9eb01c8d16cbfa62558bb4432522"} Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.552840 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n7rk7" Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.552894 4909 scope.go:117] "RemoveContainer" containerID="b59bfb7cdeb77c5e9105b11b064b42d20b0fdc021f39458733e64a3dcd7446a7" Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.575087 4909 scope.go:117] "RemoveContainer" containerID="9764816cb8894c392e273c9370f7ae71ce932dcaa53d588a5200cc7f88427e0b" Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.586025 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n7rk7"] Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.595275 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n7rk7"] Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.603619 4909 scope.go:117] "RemoveContainer" containerID="f3092ab1b88968392f1ca72eea8bf7964e8aaff8078299fc051450e04ee4468e" Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.636619 4909 scope.go:117] "RemoveContainer" containerID="b59bfb7cdeb77c5e9105b11b064b42d20b0fdc021f39458733e64a3dcd7446a7" Dec 01 11:18:39 crc kubenswrapper[4909]: E1201 11:18:39.637123 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b59bfb7cdeb77c5e9105b11b064b42d20b0fdc021f39458733e64a3dcd7446a7\": container with ID starting with b59bfb7cdeb77c5e9105b11b064b42d20b0fdc021f39458733e64a3dcd7446a7 not found: ID does not exist" containerID="b59bfb7cdeb77c5e9105b11b064b42d20b0fdc021f39458733e64a3dcd7446a7" Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.637164 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b59bfb7cdeb77c5e9105b11b064b42d20b0fdc021f39458733e64a3dcd7446a7"} err="failed to get container status \"b59bfb7cdeb77c5e9105b11b064b42d20b0fdc021f39458733e64a3dcd7446a7\": rpc error: code = NotFound desc = could not find container \"b59bfb7cdeb77c5e9105b11b064b42d20b0fdc021f39458733e64a3dcd7446a7\": container with ID starting with b59bfb7cdeb77c5e9105b11b064b42d20b0fdc021f39458733e64a3dcd7446a7 not found: ID does not exist" Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.637190 4909 scope.go:117] "RemoveContainer" containerID="9764816cb8894c392e273c9370f7ae71ce932dcaa53d588a5200cc7f88427e0b" Dec 01 11:18:39 crc kubenswrapper[4909]: E1201 11:18:39.637481 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9764816cb8894c392e273c9370f7ae71ce932dcaa53d588a5200cc7f88427e0b\": container with ID starting with 9764816cb8894c392e273c9370f7ae71ce932dcaa53d588a5200cc7f88427e0b not found: ID does not exist" containerID="9764816cb8894c392e273c9370f7ae71ce932dcaa53d588a5200cc7f88427e0b" Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.637504 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9764816cb8894c392e273c9370f7ae71ce932dcaa53d588a5200cc7f88427e0b"} err="failed to get container status \"9764816cb8894c392e273c9370f7ae71ce932dcaa53d588a5200cc7f88427e0b\": rpc error: code = NotFound desc = could not find container \"9764816cb8894c392e273c9370f7ae71ce932dcaa53d588a5200cc7f88427e0b\": container with ID starting with 9764816cb8894c392e273c9370f7ae71ce932dcaa53d588a5200cc7f88427e0b not found: ID does not exist" Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.637519 4909 scope.go:117] "RemoveContainer" containerID="f3092ab1b88968392f1ca72eea8bf7964e8aaff8078299fc051450e04ee4468e" Dec 01 11:18:39 crc kubenswrapper[4909]: E1201 
11:18:39.637851 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3092ab1b88968392f1ca72eea8bf7964e8aaff8078299fc051450e04ee4468e\": container with ID starting with f3092ab1b88968392f1ca72eea8bf7964e8aaff8078299fc051450e04ee4468e not found: ID does not exist" containerID="f3092ab1b88968392f1ca72eea8bf7964e8aaff8078299fc051450e04ee4468e" Dec 01 11:18:39 crc kubenswrapper[4909]: I1201 11:18:39.637886 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3092ab1b88968392f1ca72eea8bf7964e8aaff8078299fc051450e04ee4468e"} err="failed to get container status \"f3092ab1b88968392f1ca72eea8bf7964e8aaff8078299fc051450e04ee4468e\": rpc error: code = NotFound desc = could not find container \"f3092ab1b88968392f1ca72eea8bf7964e8aaff8078299fc051450e04ee4468e\": container with ID starting with f3092ab1b88968392f1ca72eea8bf7964e8aaff8078299fc051450e04ee4468e not found: ID does not exist" Dec 01 11:18:41 crc kubenswrapper[4909]: I1201 11:18:41.268983 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3f2d197-6d1f-4798-82ae-a019d39c056f" path="/var/lib/kubelet/pods/a3f2d197-6d1f-4798-82ae-a019d39c056f/volumes" Dec 01 11:19:06 crc kubenswrapper[4909]: I1201 11:19:06.194464 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:19:06 crc kubenswrapper[4909]: I1201 11:19:06.195124 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 01 11:19:36 crc kubenswrapper[4909]: I1201 11:19:36.194377 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:19:36 crc kubenswrapper[4909]: I1201 11:19:36.195524 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:19:36 crc kubenswrapper[4909]: I1201 11:19:36.195607 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 11:19:36 crc kubenswrapper[4909]: I1201 11:19:36.197181 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19"} pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 11:19:36 crc kubenswrapper[4909]: I1201 11:19:36.197273 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" containerID="cri-o://f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" gracePeriod=600 Dec 01 11:19:36 crc kubenswrapper[4909]: E1201 11:19:36.319979 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:19:37 crc kubenswrapper[4909]: I1201 11:19:37.083855 4909 generic.go:334] "Generic (PLEG): container finished" podID="672850e4-d044-44cc-b8a2-517dc1a285be" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" exitCode=0 Dec 01 11:19:37 crc kubenswrapper[4909]: I1201 11:19:37.083919 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerDied","Data":"f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19"} Dec 01 11:19:37 crc kubenswrapper[4909]: I1201 11:19:37.083986 4909 scope.go:117] "RemoveContainer" containerID="67c1dd645f41f242e024e22de275733572cd92a40ace8824f054e102207e7cb1" Dec 01 11:19:37 crc kubenswrapper[4909]: I1201 11:19:37.085287 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:19:37 crc kubenswrapper[4909]: E1201 11:19:37.085749 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:19:51 crc kubenswrapper[4909]: I1201 11:19:51.257766 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:19:51 crc kubenswrapper[4909]: E1201 11:19:51.258724 4909 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:20:06 crc kubenswrapper[4909]: I1201 11:20:06.257139 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:20:06 crc kubenswrapper[4909]: E1201 11:20:06.257933 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:20:18 crc kubenswrapper[4909]: I1201 11:20:18.257677 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:20:18 crc kubenswrapper[4909]: E1201 11:20:18.258541 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:20:32 crc kubenswrapper[4909]: I1201 11:20:32.258054 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:20:32 crc kubenswrapper[4909]: E1201 11:20:32.258748 4909 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:20:44 crc kubenswrapper[4909]: I1201 11:20:44.257292 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:20:44 crc kubenswrapper[4909]: E1201 11:20:44.258011 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:20:59 crc kubenswrapper[4909]: I1201 11:20:59.258408 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:20:59 crc kubenswrapper[4909]: E1201 11:20:59.259256 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:21:14 crc kubenswrapper[4909]: I1201 11:21:14.258748 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:21:14 crc kubenswrapper[4909]: E1201 
11:21:14.262451 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:21:29 crc kubenswrapper[4909]: I1201 11:21:29.257745 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:21:29 crc kubenswrapper[4909]: E1201 11:21:29.258443 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:21:40 crc kubenswrapper[4909]: I1201 11:21:40.256641 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:21:40 crc kubenswrapper[4909]: E1201 11:21:40.257382 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:21:49 crc kubenswrapper[4909]: I1201 11:21:49.870846 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5kkmq"] Dec 01 11:21:49 crc 
kubenswrapper[4909]: E1201 11:21:49.871804 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3f2d197-6d1f-4798-82ae-a019d39c056f" containerName="registry-server" Dec 01 11:21:49 crc kubenswrapper[4909]: I1201 11:21:49.871817 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3f2d197-6d1f-4798-82ae-a019d39c056f" containerName="registry-server" Dec 01 11:21:49 crc kubenswrapper[4909]: E1201 11:21:49.871827 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3f2d197-6d1f-4798-82ae-a019d39c056f" containerName="extract-utilities" Dec 01 11:21:49 crc kubenswrapper[4909]: I1201 11:21:49.871833 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3f2d197-6d1f-4798-82ae-a019d39c056f" containerName="extract-utilities" Dec 01 11:21:49 crc kubenswrapper[4909]: E1201 11:21:49.871860 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3f2d197-6d1f-4798-82ae-a019d39c056f" containerName="extract-content" Dec 01 11:21:49 crc kubenswrapper[4909]: I1201 11:21:49.871866 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3f2d197-6d1f-4798-82ae-a019d39c056f" containerName="extract-content" Dec 01 11:21:49 crc kubenswrapper[4909]: I1201 11:21:49.872056 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3f2d197-6d1f-4798-82ae-a019d39c056f" containerName="registry-server" Dec 01 11:21:49 crc kubenswrapper[4909]: I1201 11:21:49.873269 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5kkmq" Dec 01 11:21:49 crc kubenswrapper[4909]: I1201 11:21:49.895373 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5kkmq"] Dec 01 11:21:49 crc kubenswrapper[4909]: I1201 11:21:49.989722 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c4b26f-df0e-456a-ba3a-f9a5c9bd183e-utilities\") pod \"community-operators-5kkmq\" (UID: \"99c4b26f-df0e-456a-ba3a-f9a5c9bd183e\") " pod="openshift-marketplace/community-operators-5kkmq" Dec 01 11:21:49 crc kubenswrapper[4909]: I1201 11:21:49.990051 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdvj7\" (UniqueName: \"kubernetes.io/projected/99c4b26f-df0e-456a-ba3a-f9a5c9bd183e-kube-api-access-kdvj7\") pod \"community-operators-5kkmq\" (UID: \"99c4b26f-df0e-456a-ba3a-f9a5c9bd183e\") " pod="openshift-marketplace/community-operators-5kkmq" Dec 01 11:21:49 crc kubenswrapper[4909]: I1201 11:21:49.990140 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c4b26f-df0e-456a-ba3a-f9a5c9bd183e-catalog-content\") pod \"community-operators-5kkmq\" (UID: \"99c4b26f-df0e-456a-ba3a-f9a5c9bd183e\") " pod="openshift-marketplace/community-operators-5kkmq" Dec 01 11:21:50 crc kubenswrapper[4909]: I1201 11:21:50.092443 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdvj7\" (UniqueName: \"kubernetes.io/projected/99c4b26f-df0e-456a-ba3a-f9a5c9bd183e-kube-api-access-kdvj7\") pod \"community-operators-5kkmq\" (UID: \"99c4b26f-df0e-456a-ba3a-f9a5c9bd183e\") " pod="openshift-marketplace/community-operators-5kkmq" Dec 01 11:21:50 crc kubenswrapper[4909]: I1201 11:21:50.092772 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c4b26f-df0e-456a-ba3a-f9a5c9bd183e-catalog-content\") pod \"community-operators-5kkmq\" (UID: \"99c4b26f-df0e-456a-ba3a-f9a5c9bd183e\") " pod="openshift-marketplace/community-operators-5kkmq" Dec 01 11:21:50 crc kubenswrapper[4909]: I1201 11:21:50.093032 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c4b26f-df0e-456a-ba3a-f9a5c9bd183e-utilities\") pod \"community-operators-5kkmq\" (UID: \"99c4b26f-df0e-456a-ba3a-f9a5c9bd183e\") " pod="openshift-marketplace/community-operators-5kkmq" Dec 01 11:21:50 crc kubenswrapper[4909]: I1201 11:21:50.093373 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c4b26f-df0e-456a-ba3a-f9a5c9bd183e-catalog-content\") pod \"community-operators-5kkmq\" (UID: \"99c4b26f-df0e-456a-ba3a-f9a5c9bd183e\") " pod="openshift-marketplace/community-operators-5kkmq" Dec 01 11:21:50 crc kubenswrapper[4909]: I1201 11:21:50.093429 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c4b26f-df0e-456a-ba3a-f9a5c9bd183e-utilities\") pod \"community-operators-5kkmq\" (UID: \"99c4b26f-df0e-456a-ba3a-f9a5c9bd183e\") " pod="openshift-marketplace/community-operators-5kkmq" Dec 01 11:21:50 crc kubenswrapper[4909]: I1201 11:21:50.113494 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdvj7\" (UniqueName: \"kubernetes.io/projected/99c4b26f-df0e-456a-ba3a-f9a5c9bd183e-kube-api-access-kdvj7\") pod \"community-operators-5kkmq\" (UID: \"99c4b26f-df0e-456a-ba3a-f9a5c9bd183e\") " pod="openshift-marketplace/community-operators-5kkmq" Dec 01 11:21:50 crc kubenswrapper[4909]: I1201 11:21:50.193821 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5kkmq" Dec 01 11:21:50 crc kubenswrapper[4909]: I1201 11:21:50.748177 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5kkmq"] Dec 01 11:21:51 crc kubenswrapper[4909]: I1201 11:21:51.069708 4909 generic.go:334] "Generic (PLEG): container finished" podID="99c4b26f-df0e-456a-ba3a-f9a5c9bd183e" containerID="4d1081ce0ca170fe1f3b95981be8991c0a9b615a1decf4e1e200ff13199d020e" exitCode=0 Dec 01 11:21:51 crc kubenswrapper[4909]: I1201 11:21:51.069781 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kkmq" event={"ID":"99c4b26f-df0e-456a-ba3a-f9a5c9bd183e","Type":"ContainerDied","Data":"4d1081ce0ca170fe1f3b95981be8991c0a9b615a1decf4e1e200ff13199d020e"} Dec 01 11:21:51 crc kubenswrapper[4909]: I1201 11:21:51.070079 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kkmq" event={"ID":"99c4b26f-df0e-456a-ba3a-f9a5c9bd183e","Type":"ContainerStarted","Data":"a71636c5bbe1b3a296ad6b5a207ddf9c96c8a710254ab46538bfa79f243d0400"} Dec 01 11:21:51 crc kubenswrapper[4909]: I1201 11:21:51.072146 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 11:21:51 crc kubenswrapper[4909]: I1201 11:21:51.258178 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:21:51 crc kubenswrapper[4909]: E1201 11:21:51.258474 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 
11:21:52 crc kubenswrapper[4909]: I1201 11:21:52.081114 4909 generic.go:334] "Generic (PLEG): container finished" podID="99c4b26f-df0e-456a-ba3a-f9a5c9bd183e" containerID="277971832e37a37c31ace20f31f02fabb48b313cfc25886b80f41af359850d37" exitCode=0 Dec 01 11:21:52 crc kubenswrapper[4909]: I1201 11:21:52.081324 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kkmq" event={"ID":"99c4b26f-df0e-456a-ba3a-f9a5c9bd183e","Type":"ContainerDied","Data":"277971832e37a37c31ace20f31f02fabb48b313cfc25886b80f41af359850d37"} Dec 01 11:21:53 crc kubenswrapper[4909]: I1201 11:21:53.094648 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kkmq" event={"ID":"99c4b26f-df0e-456a-ba3a-f9a5c9bd183e","Type":"ContainerStarted","Data":"7297cf8f9752c6b205a610cd252a46c016e8fe9ce97b13c6daadc23e592cd1de"} Dec 01 11:22:00 crc kubenswrapper[4909]: I1201 11:22:00.194236 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5kkmq" Dec 01 11:22:00 crc kubenswrapper[4909]: I1201 11:22:00.194826 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5kkmq" Dec 01 11:22:00 crc kubenswrapper[4909]: I1201 11:22:00.240662 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5kkmq" Dec 01 11:22:00 crc kubenswrapper[4909]: I1201 11:22:00.265837 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5kkmq" podStartSLOduration=9.458068251 podStartE2EDuration="11.265818917s" podCreationTimestamp="2025-12-01 11:21:49 +0000 UTC" firstStartedPulling="2025-12-01 11:21:51.071922462 +0000 UTC m=+3028.306393370" lastFinishedPulling="2025-12-01 11:21:52.879673128 +0000 UTC m=+3030.114144036" observedRunningTime="2025-12-01 11:21:53.120732236 +0000 UTC 
m=+3030.355203134" watchObservedRunningTime="2025-12-01 11:22:00.265818917 +0000 UTC m=+3037.500289815" Dec 01 11:22:01 crc kubenswrapper[4909]: I1201 11:22:01.209613 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5kkmq" Dec 01 11:22:01 crc kubenswrapper[4909]: I1201 11:22:01.268506 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5kkmq"] Dec 01 11:22:03 crc kubenswrapper[4909]: I1201 11:22:03.180336 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5kkmq" podUID="99c4b26f-df0e-456a-ba3a-f9a5c9bd183e" containerName="registry-server" containerID="cri-o://7297cf8f9752c6b205a610cd252a46c016e8fe9ce97b13c6daadc23e592cd1de" gracePeriod=2 Dec 01 11:22:03 crc kubenswrapper[4909]: I1201 11:22:03.620396 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5kkmq" Dec 01 11:22:03 crc kubenswrapper[4909]: I1201 11:22:03.793769 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c4b26f-df0e-456a-ba3a-f9a5c9bd183e-catalog-content\") pod \"99c4b26f-df0e-456a-ba3a-f9a5c9bd183e\" (UID: \"99c4b26f-df0e-456a-ba3a-f9a5c9bd183e\") " Dec 01 11:22:03 crc kubenswrapper[4909]: I1201 11:22:03.794328 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c4b26f-df0e-456a-ba3a-f9a5c9bd183e-utilities\") pod \"99c4b26f-df0e-456a-ba3a-f9a5c9bd183e\" (UID: \"99c4b26f-df0e-456a-ba3a-f9a5c9bd183e\") " Dec 01 11:22:03 crc kubenswrapper[4909]: I1201 11:22:03.794473 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdvj7\" (UniqueName: 
\"kubernetes.io/projected/99c4b26f-df0e-456a-ba3a-f9a5c9bd183e-kube-api-access-kdvj7\") pod \"99c4b26f-df0e-456a-ba3a-f9a5c9bd183e\" (UID: \"99c4b26f-df0e-456a-ba3a-f9a5c9bd183e\") " Dec 01 11:22:03 crc kubenswrapper[4909]: I1201 11:22:03.795357 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99c4b26f-df0e-456a-ba3a-f9a5c9bd183e-utilities" (OuterVolumeSpecName: "utilities") pod "99c4b26f-df0e-456a-ba3a-f9a5c9bd183e" (UID: "99c4b26f-df0e-456a-ba3a-f9a5c9bd183e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:22:03 crc kubenswrapper[4909]: I1201 11:22:03.800467 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99c4b26f-df0e-456a-ba3a-f9a5c9bd183e-kube-api-access-kdvj7" (OuterVolumeSpecName: "kube-api-access-kdvj7") pod "99c4b26f-df0e-456a-ba3a-f9a5c9bd183e" (UID: "99c4b26f-df0e-456a-ba3a-f9a5c9bd183e"). InnerVolumeSpecName "kube-api-access-kdvj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:22:03 crc kubenswrapper[4909]: I1201 11:22:03.849529 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99c4b26f-df0e-456a-ba3a-f9a5c9bd183e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99c4b26f-df0e-456a-ba3a-f9a5c9bd183e" (UID: "99c4b26f-df0e-456a-ba3a-f9a5c9bd183e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:22:03 crc kubenswrapper[4909]: I1201 11:22:03.896346 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c4b26f-df0e-456a-ba3a-f9a5c9bd183e-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:22:03 crc kubenswrapper[4909]: I1201 11:22:03.896379 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdvj7\" (UniqueName: \"kubernetes.io/projected/99c4b26f-df0e-456a-ba3a-f9a5c9bd183e-kube-api-access-kdvj7\") on node \"crc\" DevicePath \"\"" Dec 01 11:22:03 crc kubenswrapper[4909]: I1201 11:22:03.896390 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c4b26f-df0e-456a-ba3a-f9a5c9bd183e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:22:04 crc kubenswrapper[4909]: I1201 11:22:04.193030 4909 generic.go:334] "Generic (PLEG): container finished" podID="99c4b26f-df0e-456a-ba3a-f9a5c9bd183e" containerID="7297cf8f9752c6b205a610cd252a46c016e8fe9ce97b13c6daadc23e592cd1de" exitCode=0 Dec 01 11:22:04 crc kubenswrapper[4909]: I1201 11:22:04.193084 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5kkmq" Dec 01 11:22:04 crc kubenswrapper[4909]: I1201 11:22:04.193108 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kkmq" event={"ID":"99c4b26f-df0e-456a-ba3a-f9a5c9bd183e","Type":"ContainerDied","Data":"7297cf8f9752c6b205a610cd252a46c016e8fe9ce97b13c6daadc23e592cd1de"} Dec 01 11:22:04 crc kubenswrapper[4909]: I1201 11:22:04.193610 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kkmq" event={"ID":"99c4b26f-df0e-456a-ba3a-f9a5c9bd183e","Type":"ContainerDied","Data":"a71636c5bbe1b3a296ad6b5a207ddf9c96c8a710254ab46538bfa79f243d0400"} Dec 01 11:22:04 crc kubenswrapper[4909]: I1201 11:22:04.193649 4909 scope.go:117] "RemoveContainer" containerID="7297cf8f9752c6b205a610cd252a46c016e8fe9ce97b13c6daadc23e592cd1de" Dec 01 11:22:04 crc kubenswrapper[4909]: I1201 11:22:04.219210 4909 scope.go:117] "RemoveContainer" containerID="277971832e37a37c31ace20f31f02fabb48b313cfc25886b80f41af359850d37" Dec 01 11:22:04 crc kubenswrapper[4909]: I1201 11:22:04.228869 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5kkmq"] Dec 01 11:22:04 crc kubenswrapper[4909]: I1201 11:22:04.239258 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5kkmq"] Dec 01 11:22:04 crc kubenswrapper[4909]: I1201 11:22:04.257382 4909 scope.go:117] "RemoveContainer" containerID="4d1081ce0ca170fe1f3b95981be8991c0a9b615a1decf4e1e200ff13199d020e" Dec 01 11:22:04 crc kubenswrapper[4909]: I1201 11:22:04.281471 4909 scope.go:117] "RemoveContainer" containerID="7297cf8f9752c6b205a610cd252a46c016e8fe9ce97b13c6daadc23e592cd1de" Dec 01 11:22:04 crc kubenswrapper[4909]: E1201 11:22:04.281963 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7297cf8f9752c6b205a610cd252a46c016e8fe9ce97b13c6daadc23e592cd1de\": container with ID starting with 7297cf8f9752c6b205a610cd252a46c016e8fe9ce97b13c6daadc23e592cd1de not found: ID does not exist" containerID="7297cf8f9752c6b205a610cd252a46c016e8fe9ce97b13c6daadc23e592cd1de" Dec 01 11:22:04 crc kubenswrapper[4909]: I1201 11:22:04.281997 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7297cf8f9752c6b205a610cd252a46c016e8fe9ce97b13c6daadc23e592cd1de"} err="failed to get container status \"7297cf8f9752c6b205a610cd252a46c016e8fe9ce97b13c6daadc23e592cd1de\": rpc error: code = NotFound desc = could not find container \"7297cf8f9752c6b205a610cd252a46c016e8fe9ce97b13c6daadc23e592cd1de\": container with ID starting with 7297cf8f9752c6b205a610cd252a46c016e8fe9ce97b13c6daadc23e592cd1de not found: ID does not exist" Dec 01 11:22:04 crc kubenswrapper[4909]: I1201 11:22:04.282020 4909 scope.go:117] "RemoveContainer" containerID="277971832e37a37c31ace20f31f02fabb48b313cfc25886b80f41af359850d37" Dec 01 11:22:04 crc kubenswrapper[4909]: E1201 11:22:04.282230 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"277971832e37a37c31ace20f31f02fabb48b313cfc25886b80f41af359850d37\": container with ID starting with 277971832e37a37c31ace20f31f02fabb48b313cfc25886b80f41af359850d37 not found: ID does not exist" containerID="277971832e37a37c31ace20f31f02fabb48b313cfc25886b80f41af359850d37" Dec 01 11:22:04 crc kubenswrapper[4909]: I1201 11:22:04.282251 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"277971832e37a37c31ace20f31f02fabb48b313cfc25886b80f41af359850d37"} err="failed to get container status \"277971832e37a37c31ace20f31f02fabb48b313cfc25886b80f41af359850d37\": rpc error: code = NotFound desc = could not find container \"277971832e37a37c31ace20f31f02fabb48b313cfc25886b80f41af359850d37\": container with ID 
starting with 277971832e37a37c31ace20f31f02fabb48b313cfc25886b80f41af359850d37 not found: ID does not exist" Dec 01 11:22:04 crc kubenswrapper[4909]: I1201 11:22:04.282264 4909 scope.go:117] "RemoveContainer" containerID="4d1081ce0ca170fe1f3b95981be8991c0a9b615a1decf4e1e200ff13199d020e" Dec 01 11:22:04 crc kubenswrapper[4909]: E1201 11:22:04.282513 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d1081ce0ca170fe1f3b95981be8991c0a9b615a1decf4e1e200ff13199d020e\": container with ID starting with 4d1081ce0ca170fe1f3b95981be8991c0a9b615a1decf4e1e200ff13199d020e not found: ID does not exist" containerID="4d1081ce0ca170fe1f3b95981be8991c0a9b615a1decf4e1e200ff13199d020e" Dec 01 11:22:04 crc kubenswrapper[4909]: I1201 11:22:04.282535 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d1081ce0ca170fe1f3b95981be8991c0a9b615a1decf4e1e200ff13199d020e"} err="failed to get container status \"4d1081ce0ca170fe1f3b95981be8991c0a9b615a1decf4e1e200ff13199d020e\": rpc error: code = NotFound desc = could not find container \"4d1081ce0ca170fe1f3b95981be8991c0a9b615a1decf4e1e200ff13199d020e\": container with ID starting with 4d1081ce0ca170fe1f3b95981be8991c0a9b615a1decf4e1e200ff13199d020e not found: ID does not exist" Dec 01 11:22:05 crc kubenswrapper[4909]: I1201 11:22:05.256834 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:22:05 crc kubenswrapper[4909]: E1201 11:22:05.257197 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" 
podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:22:05 crc kubenswrapper[4909]: I1201 11:22:05.266071 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99c4b26f-df0e-456a-ba3a-f9a5c9bd183e" path="/var/lib/kubelet/pods/99c4b26f-df0e-456a-ba3a-f9a5c9bd183e/volumes" Dec 01 11:22:17 crc kubenswrapper[4909]: I1201 11:22:17.257501 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:22:17 crc kubenswrapper[4909]: E1201 11:22:17.258310 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:22:28 crc kubenswrapper[4909]: I1201 11:22:28.258102 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:22:28 crc kubenswrapper[4909]: E1201 11:22:28.259087 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:22:40 crc kubenswrapper[4909]: I1201 11:22:40.257166 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:22:40 crc kubenswrapper[4909]: E1201 11:22:40.258179 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:22:51 crc kubenswrapper[4909]: I1201 11:22:51.271163 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:22:51 crc kubenswrapper[4909]: E1201 11:22:51.272273 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:22:54 crc kubenswrapper[4909]: I1201 11:22:54.612748 4909 generic.go:334] "Generic (PLEG): container finished" podID="97aa464f-8d31-450b-a16a-4c6538c27bbb" containerID="c270d1fe3749d0047c93f1a6ef30ba3c4c48fe137f3bba4b71b597e913f37a9c" exitCode=2 Dec 01 11:22:54 crc kubenswrapper[4909]: I1201 11:22:54.612918 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" event={"ID":"97aa464f-8d31-450b-a16a-4c6538c27bbb","Type":"ContainerDied","Data":"c270d1fe3749d0047c93f1a6ef30ba3c4c48fe137f3bba4b71b597e913f37a9c"} Dec 01 11:22:56 crc kubenswrapper[4909]: I1201 11:22:56.018779 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:22:56 crc kubenswrapper[4909]: I1201 11:22:56.210139 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-inventory\") pod \"97aa464f-8d31-450b-a16a-4c6538c27bbb\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " Dec 01 11:22:56 crc kubenswrapper[4909]: I1201 11:22:56.210364 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-ceph\") pod \"97aa464f-8d31-450b-a16a-4c6538c27bbb\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " Dec 01 11:22:56 crc kubenswrapper[4909]: I1201 11:22:56.210535 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-libvirt-secret-0\") pod \"97aa464f-8d31-450b-a16a-4c6538c27bbb\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " Dec 01 11:22:56 crc kubenswrapper[4909]: I1201 11:22:56.210586 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slsqj\" (UniqueName: \"kubernetes.io/projected/97aa464f-8d31-450b-a16a-4c6538c27bbb-kube-api-access-slsqj\") pod \"97aa464f-8d31-450b-a16a-4c6538c27bbb\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " Dec 01 11:22:56 crc kubenswrapper[4909]: I1201 11:22:56.210791 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-ssh-key\") pod \"97aa464f-8d31-450b-a16a-4c6538c27bbb\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " Dec 01 11:22:56 crc kubenswrapper[4909]: I1201 11:22:56.210941 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-libvirt-combined-ca-bundle\") pod \"97aa464f-8d31-450b-a16a-4c6538c27bbb\" (UID: \"97aa464f-8d31-450b-a16a-4c6538c27bbb\") " Dec 01 11:22:56 crc kubenswrapper[4909]: I1201 11:22:56.218575 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-ceph" (OuterVolumeSpecName: "ceph") pod "97aa464f-8d31-450b-a16a-4c6538c27bbb" (UID: "97aa464f-8d31-450b-a16a-4c6538c27bbb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:22:56 crc kubenswrapper[4909]: I1201 11:22:56.219588 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97aa464f-8d31-450b-a16a-4c6538c27bbb-kube-api-access-slsqj" (OuterVolumeSpecName: "kube-api-access-slsqj") pod "97aa464f-8d31-450b-a16a-4c6538c27bbb" (UID: "97aa464f-8d31-450b-a16a-4c6538c27bbb"). InnerVolumeSpecName "kube-api-access-slsqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:22:56 crc kubenswrapper[4909]: I1201 11:22:56.219834 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "97aa464f-8d31-450b-a16a-4c6538c27bbb" (UID: "97aa464f-8d31-450b-a16a-4c6538c27bbb"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:22:56 crc kubenswrapper[4909]: I1201 11:22:56.242734 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-inventory" (OuterVolumeSpecName: "inventory") pod "97aa464f-8d31-450b-a16a-4c6538c27bbb" (UID: "97aa464f-8d31-450b-a16a-4c6538c27bbb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:22:56 crc kubenswrapper[4909]: I1201 11:22:56.248364 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "97aa464f-8d31-450b-a16a-4c6538c27bbb" (UID: "97aa464f-8d31-450b-a16a-4c6538c27bbb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:22:56 crc kubenswrapper[4909]: I1201 11:22:56.249938 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "97aa464f-8d31-450b-a16a-4c6538c27bbb" (UID: "97aa464f-8d31-450b-a16a-4c6538c27bbb"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:22:56 crc kubenswrapper[4909]: I1201 11:22:56.314053 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:22:56 crc kubenswrapper[4909]: I1201 11:22:56.314361 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 11:22:56 crc kubenswrapper[4909]: I1201 11:22:56.314433 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 11:22:56 crc kubenswrapper[4909]: I1201 11:22:56.314494 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 01 11:22:56 crc 
kubenswrapper[4909]: I1201 11:22:56.314561 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slsqj\" (UniqueName: \"kubernetes.io/projected/97aa464f-8d31-450b-a16a-4c6538c27bbb-kube-api-access-slsqj\") on node \"crc\" DevicePath \"\"" Dec 01 11:22:56 crc kubenswrapper[4909]: I1201 11:22:56.314620 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/97aa464f-8d31-450b-a16a-4c6538c27bbb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 11:22:56 crc kubenswrapper[4909]: I1201 11:22:56.634061 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" event={"ID":"97aa464f-8d31-450b-a16a-4c6538c27bbb","Type":"ContainerDied","Data":"b972c3988b43e107c62a9c7d584ab7641f180840cf9046b7e54df62ffcfb7992"} Dec 01 11:22:56 crc kubenswrapper[4909]: I1201 11:22:56.634450 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b972c3988b43e107c62a9c7d584ab7641f180840cf9046b7e54df62ffcfb7992" Dec 01 11:22:56 crc kubenswrapper[4909]: I1201 11:22:56.634140 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b" Dec 01 11:22:56 crc kubenswrapper[4909]: E1201 11:22:56.823140 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97aa464f_8d31_450b_a16a_4c6538c27bbb.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97aa464f_8d31_450b_a16a_4c6538c27bbb.slice/crio-b972c3988b43e107c62a9c7d584ab7641f180840cf9046b7e54df62ffcfb7992\": RecentStats: unable to find data in memory cache]" Dec 01 11:23:03 crc kubenswrapper[4909]: I1201 11:23:03.266993 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:23:03 crc kubenswrapper[4909]: E1201 11:23:03.268203 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.037534 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn"] Dec 01 11:23:13 crc kubenswrapper[4909]: E1201 11:23:13.038748 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c4b26f-df0e-456a-ba3a-f9a5c9bd183e" containerName="extract-content" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.038772 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c4b26f-df0e-456a-ba3a-f9a5c9bd183e" containerName="extract-content" Dec 01 11:23:13 crc kubenswrapper[4909]: E1201 11:23:13.038804 4909 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c4b26f-df0e-456a-ba3a-f9a5c9bd183e" containerName="extract-utilities" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.038816 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c4b26f-df0e-456a-ba3a-f9a5c9bd183e" containerName="extract-utilities" Dec 01 11:23:13 crc kubenswrapper[4909]: E1201 11:23:13.038842 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c4b26f-df0e-456a-ba3a-f9a5c9bd183e" containerName="registry-server" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.038857 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c4b26f-df0e-456a-ba3a-f9a5c9bd183e" containerName="registry-server" Dec 01 11:23:13 crc kubenswrapper[4909]: E1201 11:23:13.038911 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97aa464f-8d31-450b-a16a-4c6538c27bbb" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.038925 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="97aa464f-8d31-450b-a16a-4c6538c27bbb" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.039254 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="97aa464f-8d31-450b-a16a-4c6538c27bbb" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.039315 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="99c4b26f-df0e-456a-ba3a-f9a5c9bd183e" containerName="registry-server" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.040247 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.043235 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.043524 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.043645 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.043800 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.043950 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.044139 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.048466 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn"] Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.092292 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z422\" (UniqueName: \"kubernetes.io/projected/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-kube-api-access-6z422\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zshgn\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.092627 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zshgn\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.092883 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zshgn\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.093003 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zshgn\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.093048 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zshgn\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.093078 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zshgn\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.195507 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zshgn\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.195573 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zshgn\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.195625 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zshgn\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.195678 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zshgn\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.195821 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zshgn\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.196815 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z422\" (UniqueName: \"kubernetes.io/projected/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-kube-api-access-6z422\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zshgn\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.202087 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zshgn\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.203104 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zshgn\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.203220 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zshgn\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 
11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.204604 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zshgn\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.216247 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zshgn\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.220885 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z422\" (UniqueName: \"kubernetes.io/projected/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-kube-api-access-6z422\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zshgn\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.384701 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 11:23:13 crc kubenswrapper[4909]: I1201 11:23:13.865448 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn"] Dec 01 11:23:13 crc kubenswrapper[4909]: W1201 11:23:13.866082 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfde0fbe6_aa69_45a1_ac7e_a1db200b909a.slice/crio-52df201b4e4f31bbd8ee5938e8e72ede4913d776949a6b9fa8685e60718d19a8 WatchSource:0}: Error finding container 52df201b4e4f31bbd8ee5938e8e72ede4913d776949a6b9fa8685e60718d19a8: Status 404 returned error can't find the container with id 52df201b4e4f31bbd8ee5938e8e72ede4913d776949a6b9fa8685e60718d19a8 Dec 01 11:23:14 crc kubenswrapper[4909]: I1201 11:23:14.256936 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:23:14 crc kubenswrapper[4909]: E1201 11:23:14.257412 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:23:14 crc kubenswrapper[4909]: I1201 11:23:14.801029 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" event={"ID":"fde0fbe6-aa69-45a1-ac7e-a1db200b909a","Type":"ContainerStarted","Data":"52df201b4e4f31bbd8ee5938e8e72ede4913d776949a6b9fa8685e60718d19a8"} Dec 01 11:23:15 crc kubenswrapper[4909]: I1201 11:23:15.809151 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" event={"ID":"fde0fbe6-aa69-45a1-ac7e-a1db200b909a","Type":"ContainerStarted","Data":"40e72fa6c08bd2fa222c4bd92604ea945239efcff70f66a08db3cf49d14e7cc8"} Dec 01 11:23:15 crc kubenswrapper[4909]: I1201 11:23:15.828165 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" podStartSLOduration=1.4433923960000001 podStartE2EDuration="2.828146655s" podCreationTimestamp="2025-12-01 11:23:13 +0000 UTC" firstStartedPulling="2025-12-01 11:23:13.868375035 +0000 UTC m=+3111.102845933" lastFinishedPulling="2025-12-01 11:23:15.253129294 +0000 UTC m=+3112.487600192" observedRunningTime="2025-12-01 11:23:15.825070798 +0000 UTC m=+3113.059541706" watchObservedRunningTime="2025-12-01 11:23:15.828146655 +0000 UTC m=+3113.062617553" Dec 01 11:23:28 crc kubenswrapper[4909]: I1201 11:23:28.259185 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:23:28 crc kubenswrapper[4909]: E1201 11:23:28.259970 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:23:39 crc kubenswrapper[4909]: I1201 11:23:39.257286 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:23:39 crc kubenswrapper[4909]: E1201 11:23:39.258176 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:23:43 crc kubenswrapper[4909]: I1201 11:23:43.063225 4909 scope.go:117] "RemoveContainer" containerID="4b7e08a164349648f441fa35c2c0fc8068d7fb22e2ac772fe0b01b9370b82b1a" Dec 01 11:23:43 crc kubenswrapper[4909]: I1201 11:23:43.087662 4909 scope.go:117] "RemoveContainer" containerID="762a8b704c82f3a050564308bd42459fea0b21038074924409426d5dbc07630c" Dec 01 11:23:43 crc kubenswrapper[4909]: I1201 11:23:43.110243 4909 scope.go:117] "RemoveContainer" containerID="cf127cc4539a3ddb0e71990cbb6b174f36bb60dee5c164ebd368aa8206f86927" Dec 01 11:23:54 crc kubenswrapper[4909]: I1201 11:23:54.257229 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:23:54 crc kubenswrapper[4909]: E1201 11:23:54.257864 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:24:09 crc kubenswrapper[4909]: I1201 11:24:09.257175 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:24:09 crc kubenswrapper[4909]: E1201 11:24:09.257949 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:24:24 crc kubenswrapper[4909]: I1201 11:24:24.258590 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:24:24 crc kubenswrapper[4909]: E1201 11:24:24.259489 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:24:37 crc kubenswrapper[4909]: I1201 11:24:37.260756 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:24:37 crc kubenswrapper[4909]: I1201 11:24:37.514561 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"7cad518d7a6e1cb0d89b109487c8287ae615149610dc2400b5211242e8ae671d"} Dec 01 11:27:06 crc kubenswrapper[4909]: I1201 11:27:06.193984 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:27:06 crc kubenswrapper[4909]: I1201 11:27:06.194976 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:27:30 crc kubenswrapper[4909]: I1201 11:27:30.131750 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s5ktw"] Dec 01 11:27:30 crc kubenswrapper[4909]: I1201 11:27:30.134087 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s5ktw" Dec 01 11:27:30 crc kubenswrapper[4909]: I1201 11:27:30.176918 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s5ktw"] Dec 01 11:27:30 crc kubenswrapper[4909]: I1201 11:27:30.180941 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/465d01dc-b3e1-4213-adf0-090b22a79022-utilities\") pod \"certified-operators-s5ktw\" (UID: \"465d01dc-b3e1-4213-adf0-090b22a79022\") " pod="openshift-marketplace/certified-operators-s5ktw" Dec 01 11:27:30 crc kubenswrapper[4909]: I1201 11:27:30.181141 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpts5\" (UniqueName: \"kubernetes.io/projected/465d01dc-b3e1-4213-adf0-090b22a79022-kube-api-access-fpts5\") pod \"certified-operators-s5ktw\" (UID: \"465d01dc-b3e1-4213-adf0-090b22a79022\") " pod="openshift-marketplace/certified-operators-s5ktw" Dec 01 11:27:30 crc kubenswrapper[4909]: I1201 11:27:30.181170 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/465d01dc-b3e1-4213-adf0-090b22a79022-catalog-content\") pod \"certified-operators-s5ktw\" (UID: \"465d01dc-b3e1-4213-adf0-090b22a79022\") " pod="openshift-marketplace/certified-operators-s5ktw" Dec 01 11:27:30 crc kubenswrapper[4909]: I1201 11:27:30.282783 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/465d01dc-b3e1-4213-adf0-090b22a79022-utilities\") pod \"certified-operators-s5ktw\" (UID: \"465d01dc-b3e1-4213-adf0-090b22a79022\") " pod="openshift-marketplace/certified-operators-s5ktw" Dec 01 11:27:30 crc kubenswrapper[4909]: I1201 11:27:30.282950 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpts5\" (UniqueName: \"kubernetes.io/projected/465d01dc-b3e1-4213-adf0-090b22a79022-kube-api-access-fpts5\") pod \"certified-operators-s5ktw\" (UID: \"465d01dc-b3e1-4213-adf0-090b22a79022\") " pod="openshift-marketplace/certified-operators-s5ktw" Dec 01 11:27:30 crc kubenswrapper[4909]: I1201 11:27:30.282980 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/465d01dc-b3e1-4213-adf0-090b22a79022-catalog-content\") pod \"certified-operators-s5ktw\" (UID: \"465d01dc-b3e1-4213-adf0-090b22a79022\") " pod="openshift-marketplace/certified-operators-s5ktw" Dec 01 11:27:30 crc kubenswrapper[4909]: I1201 11:27:30.283451 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/465d01dc-b3e1-4213-adf0-090b22a79022-utilities\") pod \"certified-operators-s5ktw\" (UID: \"465d01dc-b3e1-4213-adf0-090b22a79022\") " pod="openshift-marketplace/certified-operators-s5ktw" Dec 01 11:27:30 crc kubenswrapper[4909]: I1201 11:27:30.283482 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/465d01dc-b3e1-4213-adf0-090b22a79022-catalog-content\") pod \"certified-operators-s5ktw\" (UID: \"465d01dc-b3e1-4213-adf0-090b22a79022\") " pod="openshift-marketplace/certified-operators-s5ktw" Dec 01 11:27:30 crc kubenswrapper[4909]: I1201 11:27:30.305428 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpts5\" (UniqueName: 
\"kubernetes.io/projected/465d01dc-b3e1-4213-adf0-090b22a79022-kube-api-access-fpts5\") pod \"certified-operators-s5ktw\" (UID: \"465d01dc-b3e1-4213-adf0-090b22a79022\") " pod="openshift-marketplace/certified-operators-s5ktw" Dec 01 11:27:30 crc kubenswrapper[4909]: I1201 11:27:30.490543 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s5ktw" Dec 01 11:27:31 crc kubenswrapper[4909]: I1201 11:27:31.023861 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s5ktw"] Dec 01 11:27:31 crc kubenswrapper[4909]: W1201 11:27:31.028037 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod465d01dc_b3e1_4213_adf0_090b22a79022.slice/crio-769ed2ca36454b05cb37753048aedbe16ba7e776328da4a04795b24b1fb72820 WatchSource:0}: Error finding container 769ed2ca36454b05cb37753048aedbe16ba7e776328da4a04795b24b1fb72820: Status 404 returned error can't find the container with id 769ed2ca36454b05cb37753048aedbe16ba7e776328da4a04795b24b1fb72820 Dec 01 11:27:31 crc kubenswrapper[4909]: I1201 11:27:31.393013 4909 generic.go:334] "Generic (PLEG): container finished" podID="465d01dc-b3e1-4213-adf0-090b22a79022" containerID="8151f84817205ca7c26c91cb5632c097cfee467f5c7fc0de7148e3194aa71a71" exitCode=0 Dec 01 11:27:31 crc kubenswrapper[4909]: I1201 11:27:31.393087 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s5ktw" event={"ID":"465d01dc-b3e1-4213-adf0-090b22a79022","Type":"ContainerDied","Data":"8151f84817205ca7c26c91cb5632c097cfee467f5c7fc0de7148e3194aa71a71"} Dec 01 11:27:31 crc kubenswrapper[4909]: I1201 11:27:31.393566 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s5ktw" 
event={"ID":"465d01dc-b3e1-4213-adf0-090b22a79022","Type":"ContainerStarted","Data":"769ed2ca36454b05cb37753048aedbe16ba7e776328da4a04795b24b1fb72820"} Dec 01 11:27:31 crc kubenswrapper[4909]: I1201 11:27:31.396054 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 11:27:33 crc kubenswrapper[4909]: I1201 11:27:33.410861 4909 generic.go:334] "Generic (PLEG): container finished" podID="465d01dc-b3e1-4213-adf0-090b22a79022" containerID="25355f2ffcdfa4cffb4a1f79895e5a9e54ef9d82ccdc633a82e213494b143007" exitCode=0 Dec 01 11:27:33 crc kubenswrapper[4909]: I1201 11:27:33.410964 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s5ktw" event={"ID":"465d01dc-b3e1-4213-adf0-090b22a79022","Type":"ContainerDied","Data":"25355f2ffcdfa4cffb4a1f79895e5a9e54ef9d82ccdc633a82e213494b143007"} Dec 01 11:27:34 crc kubenswrapper[4909]: I1201 11:27:34.423174 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s5ktw" event={"ID":"465d01dc-b3e1-4213-adf0-090b22a79022","Type":"ContainerStarted","Data":"fd0a2cd28faafca6091e199d4a695bcd71cf70e0fdc6a173a0539d8d9f90b2cb"} Dec 01 11:27:34 crc kubenswrapper[4909]: I1201 11:27:34.460249 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s5ktw" podStartSLOduration=2.039159963 podStartE2EDuration="4.460230851s" podCreationTimestamp="2025-12-01 11:27:30 +0000 UTC" firstStartedPulling="2025-12-01 11:27:31.395358959 +0000 UTC m=+3368.629829897" lastFinishedPulling="2025-12-01 11:27:33.816429877 +0000 UTC m=+3371.050900785" observedRunningTime="2025-12-01 11:27:34.45482332 +0000 UTC m=+3371.689294218" watchObservedRunningTime="2025-12-01 11:27:34.460230851 +0000 UTC m=+3371.694701749" Dec 01 11:27:36 crc kubenswrapper[4909]: I1201 11:27:36.193677 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:27:36 crc kubenswrapper[4909]: I1201 11:27:36.194012 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:27:40 crc kubenswrapper[4909]: I1201 11:27:40.491188 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s5ktw" Dec 01 11:27:40 crc kubenswrapper[4909]: I1201 11:27:40.491532 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s5ktw" Dec 01 11:27:40 crc kubenswrapper[4909]: I1201 11:27:40.543865 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s5ktw" Dec 01 11:27:41 crc kubenswrapper[4909]: I1201 11:27:41.524258 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s5ktw" Dec 01 11:27:41 crc kubenswrapper[4909]: I1201 11:27:41.574006 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s5ktw"] Dec 01 11:27:43 crc kubenswrapper[4909]: I1201 11:27:43.503927 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s5ktw" podUID="465d01dc-b3e1-4213-adf0-090b22a79022" containerName="registry-server" containerID="cri-o://fd0a2cd28faafca6091e199d4a695bcd71cf70e0fdc6a173a0539d8d9f90b2cb" gracePeriod=2 Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.436943 4909 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s5ktw" Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.514923 4909 generic.go:334] "Generic (PLEG): container finished" podID="465d01dc-b3e1-4213-adf0-090b22a79022" containerID="fd0a2cd28faafca6091e199d4a695bcd71cf70e0fdc6a173a0539d8d9f90b2cb" exitCode=0 Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.515019 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s5ktw" event={"ID":"465d01dc-b3e1-4213-adf0-090b22a79022","Type":"ContainerDied","Data":"fd0a2cd28faafca6091e199d4a695bcd71cf70e0fdc6a173a0539d8d9f90b2cb"} Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.515001 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s5ktw" Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.515150 4909 scope.go:117] "RemoveContainer" containerID="fd0a2cd28faafca6091e199d4a695bcd71cf70e0fdc6a173a0539d8d9f90b2cb" Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.515138 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s5ktw" event={"ID":"465d01dc-b3e1-4213-adf0-090b22a79022","Type":"ContainerDied","Data":"769ed2ca36454b05cb37753048aedbe16ba7e776328da4a04795b24b1fb72820"} Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.534642 4909 scope.go:117] "RemoveContainer" containerID="25355f2ffcdfa4cffb4a1f79895e5a9e54ef9d82ccdc633a82e213494b143007" Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.555187 4909 scope.go:117] "RemoveContainer" containerID="8151f84817205ca7c26c91cb5632c097cfee467f5c7fc0de7148e3194aa71a71" Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.591354 4909 scope.go:117] "RemoveContainer" containerID="fd0a2cd28faafca6091e199d4a695bcd71cf70e0fdc6a173a0539d8d9f90b2cb" Dec 01 11:27:44 crc kubenswrapper[4909]: E1201 11:27:44.591762 4909 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd0a2cd28faafca6091e199d4a695bcd71cf70e0fdc6a173a0539d8d9f90b2cb\": container with ID starting with fd0a2cd28faafca6091e199d4a695bcd71cf70e0fdc6a173a0539d8d9f90b2cb not found: ID does not exist" containerID="fd0a2cd28faafca6091e199d4a695bcd71cf70e0fdc6a173a0539d8d9f90b2cb" Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.591807 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd0a2cd28faafca6091e199d4a695bcd71cf70e0fdc6a173a0539d8d9f90b2cb"} err="failed to get container status \"fd0a2cd28faafca6091e199d4a695bcd71cf70e0fdc6a173a0539d8d9f90b2cb\": rpc error: code = NotFound desc = could not find container \"fd0a2cd28faafca6091e199d4a695bcd71cf70e0fdc6a173a0539d8d9f90b2cb\": container with ID starting with fd0a2cd28faafca6091e199d4a695bcd71cf70e0fdc6a173a0539d8d9f90b2cb not found: ID does not exist" Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.591836 4909 scope.go:117] "RemoveContainer" containerID="25355f2ffcdfa4cffb4a1f79895e5a9e54ef9d82ccdc633a82e213494b143007" Dec 01 11:27:44 crc kubenswrapper[4909]: E1201 11:27:44.592167 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25355f2ffcdfa4cffb4a1f79895e5a9e54ef9d82ccdc633a82e213494b143007\": container with ID starting with 25355f2ffcdfa4cffb4a1f79895e5a9e54ef9d82ccdc633a82e213494b143007 not found: ID does not exist" containerID="25355f2ffcdfa4cffb4a1f79895e5a9e54ef9d82ccdc633a82e213494b143007" Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.592200 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25355f2ffcdfa4cffb4a1f79895e5a9e54ef9d82ccdc633a82e213494b143007"} err="failed to get container status \"25355f2ffcdfa4cffb4a1f79895e5a9e54ef9d82ccdc633a82e213494b143007\": rpc error: code = NotFound 
desc = could not find container \"25355f2ffcdfa4cffb4a1f79895e5a9e54ef9d82ccdc633a82e213494b143007\": container with ID starting with 25355f2ffcdfa4cffb4a1f79895e5a9e54ef9d82ccdc633a82e213494b143007 not found: ID does not exist" Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.592225 4909 scope.go:117] "RemoveContainer" containerID="8151f84817205ca7c26c91cb5632c097cfee467f5c7fc0de7148e3194aa71a71" Dec 01 11:27:44 crc kubenswrapper[4909]: E1201 11:27:44.592431 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8151f84817205ca7c26c91cb5632c097cfee467f5c7fc0de7148e3194aa71a71\": container with ID starting with 8151f84817205ca7c26c91cb5632c097cfee467f5c7fc0de7148e3194aa71a71 not found: ID does not exist" containerID="8151f84817205ca7c26c91cb5632c097cfee467f5c7fc0de7148e3194aa71a71" Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.592460 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8151f84817205ca7c26c91cb5632c097cfee467f5c7fc0de7148e3194aa71a71"} err="failed to get container status \"8151f84817205ca7c26c91cb5632c097cfee467f5c7fc0de7148e3194aa71a71\": rpc error: code = NotFound desc = could not find container \"8151f84817205ca7c26c91cb5632c097cfee467f5c7fc0de7148e3194aa71a71\": container with ID starting with 8151f84817205ca7c26c91cb5632c097cfee467f5c7fc0de7148e3194aa71a71 not found: ID does not exist" Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.640089 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/465d01dc-b3e1-4213-adf0-090b22a79022-utilities\") pod \"465d01dc-b3e1-4213-adf0-090b22a79022\" (UID: \"465d01dc-b3e1-4213-adf0-090b22a79022\") " Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.640202 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpts5\" (UniqueName: 
\"kubernetes.io/projected/465d01dc-b3e1-4213-adf0-090b22a79022-kube-api-access-fpts5\") pod \"465d01dc-b3e1-4213-adf0-090b22a79022\" (UID: \"465d01dc-b3e1-4213-adf0-090b22a79022\") " Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.640325 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/465d01dc-b3e1-4213-adf0-090b22a79022-catalog-content\") pod \"465d01dc-b3e1-4213-adf0-090b22a79022\" (UID: \"465d01dc-b3e1-4213-adf0-090b22a79022\") " Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.641461 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/465d01dc-b3e1-4213-adf0-090b22a79022-utilities" (OuterVolumeSpecName: "utilities") pod "465d01dc-b3e1-4213-adf0-090b22a79022" (UID: "465d01dc-b3e1-4213-adf0-090b22a79022"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.647301 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/465d01dc-b3e1-4213-adf0-090b22a79022-kube-api-access-fpts5" (OuterVolumeSpecName: "kube-api-access-fpts5") pod "465d01dc-b3e1-4213-adf0-090b22a79022" (UID: "465d01dc-b3e1-4213-adf0-090b22a79022"). InnerVolumeSpecName "kube-api-access-fpts5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.685309 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/465d01dc-b3e1-4213-adf0-090b22a79022-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "465d01dc-b3e1-4213-adf0-090b22a79022" (UID: "465d01dc-b3e1-4213-adf0-090b22a79022"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.742748 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/465d01dc-b3e1-4213-adf0-090b22a79022-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.742787 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpts5\" (UniqueName: \"kubernetes.io/projected/465d01dc-b3e1-4213-adf0-090b22a79022-kube-api-access-fpts5\") on node \"crc\" DevicePath \"\"" Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.742801 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/465d01dc-b3e1-4213-adf0-090b22a79022-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.851647 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s5ktw"] Dec 01 11:27:44 crc kubenswrapper[4909]: I1201 11:27:44.860252 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s5ktw"] Dec 01 11:27:45 crc kubenswrapper[4909]: I1201 11:27:45.266723 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="465d01dc-b3e1-4213-adf0-090b22a79022" path="/var/lib/kubelet/pods/465d01dc-b3e1-4213-adf0-090b22a79022/volumes" Dec 01 11:28:06 crc kubenswrapper[4909]: I1201 11:28:06.193753 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:28:06 crc kubenswrapper[4909]: I1201 11:28:06.194439 4909 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:28:06 crc kubenswrapper[4909]: I1201 11:28:06.194505 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 11:28:06 crc kubenswrapper[4909]: I1201 11:28:06.195479 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7cad518d7a6e1cb0d89b109487c8287ae615149610dc2400b5211242e8ae671d"} pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 11:28:06 crc kubenswrapper[4909]: I1201 11:28:06.195591 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" containerID="cri-o://7cad518d7a6e1cb0d89b109487c8287ae615149610dc2400b5211242e8ae671d" gracePeriod=600 Dec 01 11:28:06 crc kubenswrapper[4909]: I1201 11:28:06.718845 4909 generic.go:334] "Generic (PLEG): container finished" podID="672850e4-d044-44cc-b8a2-517dc1a285be" containerID="7cad518d7a6e1cb0d89b109487c8287ae615149610dc2400b5211242e8ae671d" exitCode=0 Dec 01 11:28:06 crc kubenswrapper[4909]: I1201 11:28:06.718997 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerDied","Data":"7cad518d7a6e1cb0d89b109487c8287ae615149610dc2400b5211242e8ae671d"} Dec 01 11:28:06 crc kubenswrapper[4909]: I1201 11:28:06.719501 4909 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406"} Dec 01 11:28:06 crc kubenswrapper[4909]: I1201 11:28:06.719527 4909 scope.go:117] "RemoveContainer" containerID="f2560afad31875fd8d2d020a3508d784c641f7bdb0b8e23475c5a75dd2c55d19" Dec 01 11:29:04 crc kubenswrapper[4909]: I1201 11:29:04.226601 4909 generic.go:334] "Generic (PLEG): container finished" podID="fde0fbe6-aa69-45a1-ac7e-a1db200b909a" containerID="40e72fa6c08bd2fa222c4bd92604ea945239efcff70f66a08db3cf49d14e7cc8" exitCode=2 Dec 01 11:29:04 crc kubenswrapper[4909]: I1201 11:29:04.226676 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" event={"ID":"fde0fbe6-aa69-45a1-ac7e-a1db200b909a","Type":"ContainerDied","Data":"40e72fa6c08bd2fa222c4bd92604ea945239efcff70f66a08db3cf49d14e7cc8"} Dec 01 11:29:05 crc kubenswrapper[4909]: I1201 11:29:05.657687 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 11:29:05 crc kubenswrapper[4909]: I1201 11:29:05.832415 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-inventory\") pod \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " Dec 01 11:29:05 crc kubenswrapper[4909]: I1201 11:29:05.832765 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-libvirt-combined-ca-bundle\") pod \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " Dec 01 11:29:05 crc kubenswrapper[4909]: I1201 11:29:05.832840 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-ssh-key\") pod \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " Dec 01 11:29:05 crc kubenswrapper[4909]: I1201 11:29:05.832939 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-libvirt-secret-0\") pod \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " Dec 01 11:29:05 crc kubenswrapper[4909]: I1201 11:29:05.833046 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z422\" (UniqueName: \"kubernetes.io/projected/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-kube-api-access-6z422\") pod \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " Dec 01 11:29:05 crc kubenswrapper[4909]: I1201 11:29:05.833070 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-ceph\") pod \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\" (UID: \"fde0fbe6-aa69-45a1-ac7e-a1db200b909a\") " Dec 01 11:29:05 crc kubenswrapper[4909]: I1201 11:29:05.841726 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "fde0fbe6-aa69-45a1-ac7e-a1db200b909a" (UID: "fde0fbe6-aa69-45a1-ac7e-a1db200b909a"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:29:05 crc kubenswrapper[4909]: I1201 11:29:05.841821 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-kube-api-access-6z422" (OuterVolumeSpecName: "kube-api-access-6z422") pod "fde0fbe6-aa69-45a1-ac7e-a1db200b909a" (UID: "fde0fbe6-aa69-45a1-ac7e-a1db200b909a"). InnerVolumeSpecName "kube-api-access-6z422". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:29:05 crc kubenswrapper[4909]: I1201 11:29:05.844939 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-ceph" (OuterVolumeSpecName: "ceph") pod "fde0fbe6-aa69-45a1-ac7e-a1db200b909a" (UID: "fde0fbe6-aa69-45a1-ac7e-a1db200b909a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:29:05 crc kubenswrapper[4909]: I1201 11:29:05.864557 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-inventory" (OuterVolumeSpecName: "inventory") pod "fde0fbe6-aa69-45a1-ac7e-a1db200b909a" (UID: "fde0fbe6-aa69-45a1-ac7e-a1db200b909a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:29:05 crc kubenswrapper[4909]: I1201 11:29:05.868523 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "fde0fbe6-aa69-45a1-ac7e-a1db200b909a" (UID: "fde0fbe6-aa69-45a1-ac7e-a1db200b909a"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:29:05 crc kubenswrapper[4909]: I1201 11:29:05.875269 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fde0fbe6-aa69-45a1-ac7e-a1db200b909a" (UID: "fde0fbe6-aa69-45a1-ac7e-a1db200b909a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:29:05 crc kubenswrapper[4909]: I1201 11:29:05.935753 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 11:29:05 crc kubenswrapper[4909]: I1201 11:29:05.935792 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:29:05 crc kubenswrapper[4909]: I1201 11:29:05.935812 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 11:29:05 crc kubenswrapper[4909]: I1201 11:29:05.935826 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 01 11:29:05 crc 
kubenswrapper[4909]: I1201 11:29:05.935844 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z422\" (UniqueName: \"kubernetes.io/projected/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-kube-api-access-6z422\") on node \"crc\" DevicePath \"\"" Dec 01 11:29:05 crc kubenswrapper[4909]: I1201 11:29:05.935857 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fde0fbe6-aa69-45a1-ac7e-a1db200b909a-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 11:29:06 crc kubenswrapper[4909]: I1201 11:29:06.245152 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" event={"ID":"fde0fbe6-aa69-45a1-ac7e-a1db200b909a","Type":"ContainerDied","Data":"52df201b4e4f31bbd8ee5938e8e72ede4913d776949a6b9fa8685e60718d19a8"} Dec 01 11:29:06 crc kubenswrapper[4909]: I1201 11:29:06.245200 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52df201b4e4f31bbd8ee5938e8e72ede4913d776949a6b9fa8685e60718d19a8" Dec 01 11:29:06 crc kubenswrapper[4909]: I1201 11:29:06.245220 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zshgn" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.031586 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp"] Dec 01 11:29:43 crc kubenswrapper[4909]: E1201 11:29:43.032423 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="465d01dc-b3e1-4213-adf0-090b22a79022" containerName="registry-server" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.032438 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="465d01dc-b3e1-4213-adf0-090b22a79022" containerName="registry-server" Dec 01 11:29:43 crc kubenswrapper[4909]: E1201 11:29:43.032449 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="465d01dc-b3e1-4213-adf0-090b22a79022" containerName="extract-content" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.032457 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="465d01dc-b3e1-4213-adf0-090b22a79022" containerName="extract-content" Dec 01 11:29:43 crc kubenswrapper[4909]: E1201 11:29:43.032468 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fde0fbe6-aa69-45a1-ac7e-a1db200b909a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.032478 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fde0fbe6-aa69-45a1-ac7e-a1db200b909a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 11:29:43 crc kubenswrapper[4909]: E1201 11:29:43.032498 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="465d01dc-b3e1-4213-adf0-090b22a79022" containerName="extract-utilities" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.032506 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="465d01dc-b3e1-4213-adf0-090b22a79022" containerName="extract-utilities" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.032719 4909 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fde0fbe6-aa69-45a1-ac7e-a1db200b909a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.032748 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="465d01dc-b3e1-4213-adf0-090b22a79022" containerName="registry-server" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.033465 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.035676 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.036046 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.036178 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.036340 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.036927 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.039113 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.051048 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp"] Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.097098 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-v9khp\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.097797 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-v9khp\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.098053 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-v9khp\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.098121 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-v9khp\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.098265 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsts4\" (UniqueName: \"kubernetes.io/projected/6e92c763-dfb0-42ce-ae14-d0196b547985-kube-api-access-xsts4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-v9khp\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.098301 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-v9khp\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.200638 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-v9khp\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.200740 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-v9khp\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.200779 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-v9khp\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.200833 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsts4\" (UniqueName: 
\"kubernetes.io/projected/6e92c763-dfb0-42ce-ae14-d0196b547985-kube-api-access-xsts4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-v9khp\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.200868 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-v9khp\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.201008 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-v9khp\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.207904 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-v9khp\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.210728 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-v9khp\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.210846 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-v9khp\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.210906 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-v9khp\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.211838 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-v9khp\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.216929 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsts4\" (UniqueName: \"kubernetes.io/projected/6e92c763-dfb0-42ce-ae14-d0196b547985-kube-api-access-xsts4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-v9khp\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.368697 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:29:43 crc kubenswrapper[4909]: I1201 11:29:43.861357 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp"] Dec 01 11:29:44 crc kubenswrapper[4909]: I1201 11:29:44.556779 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" event={"ID":"6e92c763-dfb0-42ce-ae14-d0196b547985","Type":"ContainerStarted","Data":"d6a6b9df1044fdc3ee25459ced18984be38c86d9ea5d201dcf4050368514fac0"} Dec 01 11:29:45 crc kubenswrapper[4909]: I1201 11:29:45.567575 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" event={"ID":"6e92c763-dfb0-42ce-ae14-d0196b547985","Type":"ContainerStarted","Data":"cb5c1f26cffd368d35ec4bc438481870bf202530e52bc4bfacd1d6c4459f99fc"} Dec 01 11:29:45 crc kubenswrapper[4909]: I1201 11:29:45.593104 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" podStartSLOduration=2.052242417 podStartE2EDuration="2.593085848s" podCreationTimestamp="2025-12-01 11:29:43 +0000 UTC" firstStartedPulling="2025-12-01 11:29:43.8637724 +0000 UTC m=+3501.098243298" lastFinishedPulling="2025-12-01 11:29:44.404615831 +0000 UTC m=+3501.639086729" observedRunningTime="2025-12-01 11:29:45.583286598 +0000 UTC m=+3502.817757496" watchObservedRunningTime="2025-12-01 11:29:45.593085848 +0000 UTC m=+3502.827556746" Dec 01 11:30:00 crc kubenswrapper[4909]: I1201 11:30:00.145484 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409810-8r76n"] Dec 01 11:30:00 crc kubenswrapper[4909]: I1201 11:30:00.147174 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-8r76n" Dec 01 11:30:00 crc kubenswrapper[4909]: I1201 11:30:00.149970 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 11:30:00 crc kubenswrapper[4909]: I1201 11:30:00.151301 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 11:30:00 crc kubenswrapper[4909]: I1201 11:30:00.167509 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409810-8r76n"] Dec 01 11:30:00 crc kubenswrapper[4909]: I1201 11:30:00.289889 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9klr\" (UniqueName: \"kubernetes.io/projected/81698d2e-daa0-4308-b090-3f8e265c9edb-kube-api-access-w9klr\") pod \"collect-profiles-29409810-8r76n\" (UID: \"81698d2e-daa0-4308-b090-3f8e265c9edb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-8r76n" Dec 01 11:30:00 crc kubenswrapper[4909]: I1201 11:30:00.289967 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81698d2e-daa0-4308-b090-3f8e265c9edb-config-volume\") pod \"collect-profiles-29409810-8r76n\" (UID: \"81698d2e-daa0-4308-b090-3f8e265c9edb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-8r76n" Dec 01 11:30:00 crc kubenswrapper[4909]: I1201 11:30:00.290246 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81698d2e-daa0-4308-b090-3f8e265c9edb-secret-volume\") pod \"collect-profiles-29409810-8r76n\" (UID: \"81698d2e-daa0-4308-b090-3f8e265c9edb\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-8r76n" Dec 01 11:30:00 crc kubenswrapper[4909]: I1201 11:30:00.391432 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81698d2e-daa0-4308-b090-3f8e265c9edb-secret-volume\") pod \"collect-profiles-29409810-8r76n\" (UID: \"81698d2e-daa0-4308-b090-3f8e265c9edb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-8r76n" Dec 01 11:30:00 crc kubenswrapper[4909]: I1201 11:30:00.391784 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9klr\" (UniqueName: \"kubernetes.io/projected/81698d2e-daa0-4308-b090-3f8e265c9edb-kube-api-access-w9klr\") pod \"collect-profiles-29409810-8r76n\" (UID: \"81698d2e-daa0-4308-b090-3f8e265c9edb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-8r76n" Dec 01 11:30:00 crc kubenswrapper[4909]: I1201 11:30:00.391921 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81698d2e-daa0-4308-b090-3f8e265c9edb-config-volume\") pod \"collect-profiles-29409810-8r76n\" (UID: \"81698d2e-daa0-4308-b090-3f8e265c9edb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-8r76n" Dec 01 11:30:00 crc kubenswrapper[4909]: I1201 11:30:00.392789 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81698d2e-daa0-4308-b090-3f8e265c9edb-config-volume\") pod \"collect-profiles-29409810-8r76n\" (UID: \"81698d2e-daa0-4308-b090-3f8e265c9edb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-8r76n" Dec 01 11:30:00 crc kubenswrapper[4909]: I1201 11:30:00.400206 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/81698d2e-daa0-4308-b090-3f8e265c9edb-secret-volume\") pod \"collect-profiles-29409810-8r76n\" (UID: \"81698d2e-daa0-4308-b090-3f8e265c9edb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-8r76n" Dec 01 11:30:00 crc kubenswrapper[4909]: I1201 11:30:00.410847 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9klr\" (UniqueName: \"kubernetes.io/projected/81698d2e-daa0-4308-b090-3f8e265c9edb-kube-api-access-w9klr\") pod \"collect-profiles-29409810-8r76n\" (UID: \"81698d2e-daa0-4308-b090-3f8e265c9edb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-8r76n" Dec 01 11:30:00 crc kubenswrapper[4909]: I1201 11:30:00.471303 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-8r76n" Dec 01 11:30:00 crc kubenswrapper[4909]: W1201 11:30:00.899352 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81698d2e_daa0_4308_b090_3f8e265c9edb.slice/crio-9754c04e60be68fa24187a42e015368efeecccfcf5522b4fa9b17491bd290b2b WatchSource:0}: Error finding container 9754c04e60be68fa24187a42e015368efeecccfcf5522b4fa9b17491bd290b2b: Status 404 returned error can't find the container with id 9754c04e60be68fa24187a42e015368efeecccfcf5522b4fa9b17491bd290b2b Dec 01 11:30:00 crc kubenswrapper[4909]: I1201 11:30:00.906427 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409810-8r76n"] Dec 01 11:30:01 crc kubenswrapper[4909]: I1201 11:30:01.720380 4909 generic.go:334] "Generic (PLEG): container finished" podID="81698d2e-daa0-4308-b090-3f8e265c9edb" containerID="ad65925a24642b14b6066cc98671fc91028ff4dc6fc228099d19036f235166fe" exitCode=0 Dec 01 11:30:01 crc kubenswrapper[4909]: I1201 11:30:01.720460 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-8r76n" event={"ID":"81698d2e-daa0-4308-b090-3f8e265c9edb","Type":"ContainerDied","Data":"ad65925a24642b14b6066cc98671fc91028ff4dc6fc228099d19036f235166fe"} Dec 01 11:30:01 crc kubenswrapper[4909]: I1201 11:30:01.721140 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-8r76n" event={"ID":"81698d2e-daa0-4308-b090-3f8e265c9edb","Type":"ContainerStarted","Data":"9754c04e60be68fa24187a42e015368efeecccfcf5522b4fa9b17491bd290b2b"} Dec 01 11:30:03 crc kubenswrapper[4909]: I1201 11:30:03.056764 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-8r76n" Dec 01 11:30:03 crc kubenswrapper[4909]: I1201 11:30:03.140462 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81698d2e-daa0-4308-b090-3f8e265c9edb-secret-volume\") pod \"81698d2e-daa0-4308-b090-3f8e265c9edb\" (UID: \"81698d2e-daa0-4308-b090-3f8e265c9edb\") " Dec 01 11:30:03 crc kubenswrapper[4909]: I1201 11:30:03.140721 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81698d2e-daa0-4308-b090-3f8e265c9edb-config-volume\") pod \"81698d2e-daa0-4308-b090-3f8e265c9edb\" (UID: \"81698d2e-daa0-4308-b090-3f8e265c9edb\") " Dec 01 11:30:03 crc kubenswrapper[4909]: I1201 11:30:03.140835 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9klr\" (UniqueName: \"kubernetes.io/projected/81698d2e-daa0-4308-b090-3f8e265c9edb-kube-api-access-w9klr\") pod \"81698d2e-daa0-4308-b090-3f8e265c9edb\" (UID: \"81698d2e-daa0-4308-b090-3f8e265c9edb\") " Dec 01 11:30:03 crc kubenswrapper[4909]: I1201 11:30:03.141386 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/81698d2e-daa0-4308-b090-3f8e265c9edb-config-volume" (OuterVolumeSpecName: "config-volume") pod "81698d2e-daa0-4308-b090-3f8e265c9edb" (UID: "81698d2e-daa0-4308-b090-3f8e265c9edb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:30:03 crc kubenswrapper[4909]: I1201 11:30:03.146479 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81698d2e-daa0-4308-b090-3f8e265c9edb-kube-api-access-w9klr" (OuterVolumeSpecName: "kube-api-access-w9klr") pod "81698d2e-daa0-4308-b090-3f8e265c9edb" (UID: "81698d2e-daa0-4308-b090-3f8e265c9edb"). InnerVolumeSpecName "kube-api-access-w9klr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:30:03 crc kubenswrapper[4909]: I1201 11:30:03.146648 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81698d2e-daa0-4308-b090-3f8e265c9edb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "81698d2e-daa0-4308-b090-3f8e265c9edb" (UID: "81698d2e-daa0-4308-b090-3f8e265c9edb"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:30:03 crc kubenswrapper[4909]: I1201 11:30:03.243022 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81698d2e-daa0-4308-b090-3f8e265c9edb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 11:30:03 crc kubenswrapper[4909]: I1201 11:30:03.243056 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9klr\" (UniqueName: \"kubernetes.io/projected/81698d2e-daa0-4308-b090-3f8e265c9edb-kube-api-access-w9klr\") on node \"crc\" DevicePath \"\"" Dec 01 11:30:03 crc kubenswrapper[4909]: I1201 11:30:03.243066 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81698d2e-daa0-4308-b090-3f8e265c9edb-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 11:30:03 crc kubenswrapper[4909]: I1201 11:30:03.749775 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-8r76n" Dec 01 11:30:03 crc kubenswrapper[4909]: I1201 11:30:03.749560 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-8r76n" event={"ID":"81698d2e-daa0-4308-b090-3f8e265c9edb","Type":"ContainerDied","Data":"9754c04e60be68fa24187a42e015368efeecccfcf5522b4fa9b17491bd290b2b"} Dec 01 11:30:03 crc kubenswrapper[4909]: I1201 11:30:03.749909 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9754c04e60be68fa24187a42e015368efeecccfcf5522b4fa9b17491bd290b2b" Dec 01 11:30:04 crc kubenswrapper[4909]: I1201 11:30:04.129014 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409765-sh7p9"] Dec 01 11:30:04 crc kubenswrapper[4909]: I1201 11:30:04.140781 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29409765-sh7p9"] Dec 01 11:30:05 crc kubenswrapper[4909]: I1201 11:30:05.268299 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66988836-e4d6-497d-bab2-d52170d8d0ef" path="/var/lib/kubelet/pods/66988836-e4d6-497d-bab2-d52170d8d0ef/volumes" Dec 01 11:30:06 crc kubenswrapper[4909]: I1201 11:30:06.193216 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:30:06 crc kubenswrapper[4909]: I1201 11:30:06.193566 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:30:36 crc kubenswrapper[4909]: I1201 11:30:36.193691 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:30:36 crc kubenswrapper[4909]: I1201 11:30:36.194483 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:30:43 crc kubenswrapper[4909]: I1201 11:30:43.313656 4909 scope.go:117] "RemoveContainer" 
containerID="2826083d925be0994bc5dc78ccdbc853fca22a49abde0e4fef165ee1194f4710" Dec 01 11:31:06 crc kubenswrapper[4909]: I1201 11:31:06.193503 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:31:06 crc kubenswrapper[4909]: I1201 11:31:06.194042 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:31:06 crc kubenswrapper[4909]: I1201 11:31:06.194093 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 11:31:06 crc kubenswrapper[4909]: I1201 11:31:06.194740 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406"} pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 11:31:06 crc kubenswrapper[4909]: I1201 11:31:06.194792 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" containerID="cri-o://68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" gracePeriod=600 Dec 01 11:31:06 crc kubenswrapper[4909]: E1201 11:31:06.312600 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:31:07 crc kubenswrapper[4909]: I1201 11:31:07.292377 4909 generic.go:334] "Generic (PLEG): container finished" podID="672850e4-d044-44cc-b8a2-517dc1a285be" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" exitCode=0 Dec 01 11:31:07 crc kubenswrapper[4909]: I1201 11:31:07.292448 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerDied","Data":"68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406"} Dec 01 11:31:07 crc kubenswrapper[4909]: I1201 11:31:07.292507 4909 scope.go:117] "RemoveContainer" containerID="7cad518d7a6e1cb0d89b109487c8287ae615149610dc2400b5211242e8ae671d" Dec 01 11:31:07 crc kubenswrapper[4909]: I1201 11:31:07.293220 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:31:07 crc kubenswrapper[4909]: E1201 11:31:07.293608 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:31:20 crc kubenswrapper[4909]: I1201 11:31:20.257567 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:31:20 crc 
kubenswrapper[4909]: E1201 11:31:20.258394 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:31:35 crc kubenswrapper[4909]: I1201 11:31:35.256839 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:31:35 crc kubenswrapper[4909]: E1201 11:31:35.257500 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:31:46 crc kubenswrapper[4909]: I1201 11:31:46.257860 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:31:46 crc kubenswrapper[4909]: E1201 11:31:46.258646 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:32:00 crc kubenswrapper[4909]: I1201 11:32:00.257260 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 
01 11:32:00 crc kubenswrapper[4909]: E1201 11:32:00.258216 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:32:14 crc kubenswrapper[4909]: I1201 11:32:14.257111 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:32:14 crc kubenswrapper[4909]: E1201 11:32:14.258009 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:32:25 crc kubenswrapper[4909]: I1201 11:32:25.256999 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:32:25 crc kubenswrapper[4909]: E1201 11:32:25.257532 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:32:38 crc kubenswrapper[4909]: I1201 11:32:38.256934 4909 scope.go:117] "RemoveContainer" 
containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:32:38 crc kubenswrapper[4909]: E1201 11:32:38.258019 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:32:51 crc kubenswrapper[4909]: I1201 11:32:51.256901 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:32:51 crc kubenswrapper[4909]: E1201 11:32:51.257642 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:33:04 crc kubenswrapper[4909]: I1201 11:33:04.257582 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:33:04 crc kubenswrapper[4909]: E1201 11:33:04.259164 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:33:18 crc kubenswrapper[4909]: I1201 11:33:18.257761 4909 scope.go:117] 
"RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:33:18 crc kubenswrapper[4909]: E1201 11:33:18.259191 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:33:31 crc kubenswrapper[4909]: I1201 11:33:31.260986 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:33:31 crc kubenswrapper[4909]: E1201 11:33:31.262893 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:33:44 crc kubenswrapper[4909]: I1201 11:33:44.257509 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:33:44 crc kubenswrapper[4909]: E1201 11:33:44.258598 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:33:57 crc kubenswrapper[4909]: I1201 11:33:57.257189 
4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:33:57 crc kubenswrapper[4909]: E1201 11:33:57.258222 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:34:08 crc kubenswrapper[4909]: I1201 11:34:08.257022 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:34:08 crc kubenswrapper[4909]: E1201 11:34:08.257716 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:34:21 crc kubenswrapper[4909]: I1201 11:34:21.257218 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:34:21 crc kubenswrapper[4909]: E1201 11:34:21.258049 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:34:34 crc kubenswrapper[4909]: I1201 
11:34:34.257396 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:34:34 crc kubenswrapper[4909]: E1201 11:34:34.258344 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:34:45 crc kubenswrapper[4909]: I1201 11:34:45.257190 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:34:45 crc kubenswrapper[4909]: E1201 11:34:45.258067 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:34:56 crc kubenswrapper[4909]: I1201 11:34:56.257802 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:34:56 crc kubenswrapper[4909]: E1201 11:34:56.259045 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:35:08 crc 
kubenswrapper[4909]: I1201 11:35:08.257726 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:35:08 crc kubenswrapper[4909]: E1201 11:35:08.258537 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:35:20 crc kubenswrapper[4909]: I1201 11:35:20.256896 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:35:20 crc kubenswrapper[4909]: E1201 11:35:20.257756 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:35:33 crc kubenswrapper[4909]: I1201 11:35:33.903216 4909 generic.go:334] "Generic (PLEG): container finished" podID="6e92c763-dfb0-42ce-ae14-d0196b547985" containerID="cb5c1f26cffd368d35ec4bc438481870bf202530e52bc4bfacd1d6c4459f99fc" exitCode=2 Dec 01 11:35:33 crc kubenswrapper[4909]: I1201 11:35:33.903372 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" event={"ID":"6e92c763-dfb0-42ce-ae14-d0196b547985","Type":"ContainerDied","Data":"cb5c1f26cffd368d35ec4bc438481870bf202530e52bc4bfacd1d6c4459f99fc"} Dec 01 11:35:34 crc kubenswrapper[4909]: I1201 11:35:34.257414 4909 scope.go:117] 
"RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:35:34 crc kubenswrapper[4909]: E1201 11:35:34.257862 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:35:35 crc kubenswrapper[4909]: I1201 11:35:35.415684 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:35:35 crc kubenswrapper[4909]: I1201 11:35:35.476206 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-inventory\") pod \"6e92c763-dfb0-42ce-ae14-d0196b547985\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " Dec 01 11:35:35 crc kubenswrapper[4909]: I1201 11:35:35.476606 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-libvirt-secret-0\") pod \"6e92c763-dfb0-42ce-ae14-d0196b547985\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " Dec 01 11:35:35 crc kubenswrapper[4909]: I1201 11:35:35.476639 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-ceph\") pod \"6e92c763-dfb0-42ce-ae14-d0196b547985\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " Dec 01 11:35:35 crc kubenswrapper[4909]: I1201 11:35:35.476750 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-libvirt-combined-ca-bundle\") pod \"6e92c763-dfb0-42ce-ae14-d0196b547985\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " Dec 01 11:35:35 crc kubenswrapper[4909]: I1201 11:35:35.476922 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsts4\" (UniqueName: \"kubernetes.io/projected/6e92c763-dfb0-42ce-ae14-d0196b547985-kube-api-access-xsts4\") pod \"6e92c763-dfb0-42ce-ae14-d0196b547985\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " Dec 01 11:35:35 crc kubenswrapper[4909]: I1201 11:35:35.476954 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-ssh-key\") pod \"6e92c763-dfb0-42ce-ae14-d0196b547985\" (UID: \"6e92c763-dfb0-42ce-ae14-d0196b547985\") " Dec 01 11:35:35 crc kubenswrapper[4909]: I1201 11:35:35.481779 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-ceph" (OuterVolumeSpecName: "ceph") pod "6e92c763-dfb0-42ce-ae14-d0196b547985" (UID: "6e92c763-dfb0-42ce-ae14-d0196b547985"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:35:35 crc kubenswrapper[4909]: I1201 11:35:35.483743 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e92c763-dfb0-42ce-ae14-d0196b547985-kube-api-access-xsts4" (OuterVolumeSpecName: "kube-api-access-xsts4") pod "6e92c763-dfb0-42ce-ae14-d0196b547985" (UID: "6e92c763-dfb0-42ce-ae14-d0196b547985"). InnerVolumeSpecName "kube-api-access-xsts4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:35:35 crc kubenswrapper[4909]: I1201 11:35:35.486015 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "6e92c763-dfb0-42ce-ae14-d0196b547985" (UID: "6e92c763-dfb0-42ce-ae14-d0196b547985"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:35:35 crc kubenswrapper[4909]: I1201 11:35:35.503695 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6e92c763-dfb0-42ce-ae14-d0196b547985" (UID: "6e92c763-dfb0-42ce-ae14-d0196b547985"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:35:35 crc kubenswrapper[4909]: I1201 11:35:35.509780 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "6e92c763-dfb0-42ce-ae14-d0196b547985" (UID: "6e92c763-dfb0-42ce-ae14-d0196b547985"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:35:35 crc kubenswrapper[4909]: I1201 11:35:35.511787 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-inventory" (OuterVolumeSpecName: "inventory") pod "6e92c763-dfb0-42ce-ae14-d0196b547985" (UID: "6e92c763-dfb0-42ce-ae14-d0196b547985"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:35:35 crc kubenswrapper[4909]: I1201 11:35:35.578850 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsts4\" (UniqueName: \"kubernetes.io/projected/6e92c763-dfb0-42ce-ae14-d0196b547985-kube-api-access-xsts4\") on node \"crc\" DevicePath \"\"" Dec 01 11:35:35 crc kubenswrapper[4909]: I1201 11:35:35.578906 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 11:35:35 crc kubenswrapper[4909]: I1201 11:35:35.578917 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 11:35:35 crc kubenswrapper[4909]: I1201 11:35:35.578926 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 01 11:35:35 crc kubenswrapper[4909]: I1201 11:35:35.578959 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 11:35:35 crc kubenswrapper[4909]: I1201 11:35:35.578968 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e92c763-dfb0-42ce-ae14-d0196b547985-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:35:35 crc kubenswrapper[4909]: I1201 11:35:35.919190 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" event={"ID":"6e92c763-dfb0-42ce-ae14-d0196b547985","Type":"ContainerDied","Data":"d6a6b9df1044fdc3ee25459ced18984be38c86d9ea5d201dcf4050368514fac0"} Dec 01 11:35:35 crc 
kubenswrapper[4909]: I1201 11:35:35.919233 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6a6b9df1044fdc3ee25459ced18984be38c86d9ea5d201dcf4050368514fac0" Dec 01 11:35:35 crc kubenswrapper[4909]: I1201 11:35:35.919246 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-v9khp" Dec 01 11:35:47 crc kubenswrapper[4909]: I1201 11:35:47.257148 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:35:47 crc kubenswrapper[4909]: E1201 11:35:47.257839 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:36:00 crc kubenswrapper[4909]: I1201 11:36:00.257699 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:36:00 crc kubenswrapper[4909]: E1201 11:36:00.258465 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:36:15 crc kubenswrapper[4909]: I1201 11:36:15.257216 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:36:16 crc kubenswrapper[4909]: I1201 11:36:16.350133 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"74b7b026b70e8ab5e9f6682b27a220aa61db7324065195ffb38680f8406276f8"} Dec 01 11:36:42 crc kubenswrapper[4909]: I1201 11:36:42.047990 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wvqmx"] Dec 01 11:36:42 crc kubenswrapper[4909]: E1201 11:36:42.050349 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e92c763-dfb0-42ce-ae14-d0196b547985" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 11:36:42 crc kubenswrapper[4909]: I1201 11:36:42.050475 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e92c763-dfb0-42ce-ae14-d0196b547985" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 11:36:42 crc kubenswrapper[4909]: E1201 11:36:42.050591 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81698d2e-daa0-4308-b090-3f8e265c9edb" containerName="collect-profiles" Dec 01 11:36:42 crc kubenswrapper[4909]: I1201 11:36:42.050669 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="81698d2e-daa0-4308-b090-3f8e265c9edb" containerName="collect-profiles" Dec 01 11:36:42 crc kubenswrapper[4909]: I1201 11:36:42.051021 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="81698d2e-daa0-4308-b090-3f8e265c9edb" containerName="collect-profiles" Dec 01 11:36:42 crc kubenswrapper[4909]: I1201 11:36:42.051128 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e92c763-dfb0-42ce-ae14-d0196b547985" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 11:36:42 crc kubenswrapper[4909]: I1201 11:36:42.053038 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wvqmx" Dec 01 11:36:42 crc kubenswrapper[4909]: I1201 11:36:42.071794 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wvqmx"] Dec 01 11:36:42 crc kubenswrapper[4909]: I1201 11:36:42.213922 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b051e7-3e14-4cfe-8737-0c4035127110-utilities\") pod \"redhat-marketplace-wvqmx\" (UID: \"66b051e7-3e14-4cfe-8737-0c4035127110\") " pod="openshift-marketplace/redhat-marketplace-wvqmx" Dec 01 11:36:42 crc kubenswrapper[4909]: I1201 11:36:42.214066 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b051e7-3e14-4cfe-8737-0c4035127110-catalog-content\") pod \"redhat-marketplace-wvqmx\" (UID: \"66b051e7-3e14-4cfe-8737-0c4035127110\") " pod="openshift-marketplace/redhat-marketplace-wvqmx" Dec 01 11:36:42 crc kubenswrapper[4909]: I1201 11:36:42.214150 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh4zs\" (UniqueName: \"kubernetes.io/projected/66b051e7-3e14-4cfe-8737-0c4035127110-kube-api-access-kh4zs\") pod \"redhat-marketplace-wvqmx\" (UID: \"66b051e7-3e14-4cfe-8737-0c4035127110\") " pod="openshift-marketplace/redhat-marketplace-wvqmx" Dec 01 11:36:42 crc kubenswrapper[4909]: I1201 11:36:42.315849 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b051e7-3e14-4cfe-8737-0c4035127110-utilities\") pod \"redhat-marketplace-wvqmx\" (UID: \"66b051e7-3e14-4cfe-8737-0c4035127110\") " pod="openshift-marketplace/redhat-marketplace-wvqmx" Dec 01 11:36:42 crc kubenswrapper[4909]: I1201 11:36:42.315961 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b051e7-3e14-4cfe-8737-0c4035127110-catalog-content\") pod \"redhat-marketplace-wvqmx\" (UID: \"66b051e7-3e14-4cfe-8737-0c4035127110\") " pod="openshift-marketplace/redhat-marketplace-wvqmx" Dec 01 11:36:42 crc kubenswrapper[4909]: I1201 11:36:42.316069 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh4zs\" (UniqueName: \"kubernetes.io/projected/66b051e7-3e14-4cfe-8737-0c4035127110-kube-api-access-kh4zs\") pod \"redhat-marketplace-wvqmx\" (UID: \"66b051e7-3e14-4cfe-8737-0c4035127110\") " pod="openshift-marketplace/redhat-marketplace-wvqmx" Dec 01 11:36:42 crc kubenswrapper[4909]: I1201 11:36:42.316366 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b051e7-3e14-4cfe-8737-0c4035127110-utilities\") pod \"redhat-marketplace-wvqmx\" (UID: \"66b051e7-3e14-4cfe-8737-0c4035127110\") " pod="openshift-marketplace/redhat-marketplace-wvqmx" Dec 01 11:36:42 crc kubenswrapper[4909]: I1201 11:36:42.316483 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b051e7-3e14-4cfe-8737-0c4035127110-catalog-content\") pod \"redhat-marketplace-wvqmx\" (UID: \"66b051e7-3e14-4cfe-8737-0c4035127110\") " pod="openshift-marketplace/redhat-marketplace-wvqmx" Dec 01 11:36:42 crc kubenswrapper[4909]: I1201 11:36:42.337727 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh4zs\" (UniqueName: \"kubernetes.io/projected/66b051e7-3e14-4cfe-8737-0c4035127110-kube-api-access-kh4zs\") pod \"redhat-marketplace-wvqmx\" (UID: \"66b051e7-3e14-4cfe-8737-0c4035127110\") " pod="openshift-marketplace/redhat-marketplace-wvqmx" Dec 01 11:36:42 crc kubenswrapper[4909]: I1201 11:36:42.382583 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wvqmx" Dec 01 11:36:42 crc kubenswrapper[4909]: I1201 11:36:42.871407 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wvqmx"] Dec 01 11:36:43 crc kubenswrapper[4909]: I1201 11:36:43.589257 4909 generic.go:334] "Generic (PLEG): container finished" podID="66b051e7-3e14-4cfe-8737-0c4035127110" containerID="797560c7718fe4866da2135d7011bbf5f85f96af6641e63600ae8565d515ce18" exitCode=0 Dec 01 11:36:43 crc kubenswrapper[4909]: I1201 11:36:43.589357 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvqmx" event={"ID":"66b051e7-3e14-4cfe-8737-0c4035127110","Type":"ContainerDied","Data":"797560c7718fe4866da2135d7011bbf5f85f96af6641e63600ae8565d515ce18"} Dec 01 11:36:43 crc kubenswrapper[4909]: I1201 11:36:43.589521 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvqmx" event={"ID":"66b051e7-3e14-4cfe-8737-0c4035127110","Type":"ContainerStarted","Data":"033c74808b53625fa432a2e2a29974fd4cad11e8099df41475e4b7ae60d57efd"} Dec 01 11:36:43 crc kubenswrapper[4909]: I1201 11:36:43.592247 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 11:36:45 crc kubenswrapper[4909]: I1201 11:36:45.608669 4909 generic.go:334] "Generic (PLEG): container finished" podID="66b051e7-3e14-4cfe-8737-0c4035127110" containerID="0bdee7062a997c217afaff34c52636d235352e255c8e9c58ea954c2c07d0bcef" exitCode=0 Dec 01 11:36:45 crc kubenswrapper[4909]: I1201 11:36:45.608730 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvqmx" event={"ID":"66b051e7-3e14-4cfe-8737-0c4035127110","Type":"ContainerDied","Data":"0bdee7062a997c217afaff34c52636d235352e255c8e9c58ea954c2c07d0bcef"} Dec 01 11:36:46 crc kubenswrapper[4909]: I1201 11:36:46.618254 4909 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-wvqmx" event={"ID":"66b051e7-3e14-4cfe-8737-0c4035127110","Type":"ContainerStarted","Data":"d1c297e53e7a37c935e4861103b0c150228d0648d016eecb59cae23aa6828e5f"} Dec 01 11:36:46 crc kubenswrapper[4909]: I1201 11:36:46.639303 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wvqmx" podStartSLOduration=2.09968141 podStartE2EDuration="4.639286404s" podCreationTimestamp="2025-12-01 11:36:42 +0000 UTC" firstStartedPulling="2025-12-01 11:36:43.591977914 +0000 UTC m=+3920.826448812" lastFinishedPulling="2025-12-01 11:36:46.131582888 +0000 UTC m=+3923.366053806" observedRunningTime="2025-12-01 11:36:46.634508226 +0000 UTC m=+3923.868979124" watchObservedRunningTime="2025-12-01 11:36:46.639286404 +0000 UTC m=+3923.873757312" Dec 01 11:36:52 crc kubenswrapper[4909]: I1201 11:36:52.382989 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wvqmx" Dec 01 11:36:52 crc kubenswrapper[4909]: I1201 11:36:52.383535 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wvqmx" Dec 01 11:36:52 crc kubenswrapper[4909]: I1201 11:36:52.457367 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wvqmx" Dec 01 11:36:52 crc kubenswrapper[4909]: I1201 11:36:52.753375 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wvqmx" Dec 01 11:36:52 crc kubenswrapper[4909]: I1201 11:36:52.814362 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wvqmx"] Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.032830 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr"] Dec 01 11:36:53 crc kubenswrapper[4909]: 
I1201 11:36:53.034496 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.038131 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.038555 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.038762 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.038811 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.039017 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.039177 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.048748 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr"] Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.129978 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.130029 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.130449 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn2jr\" (UniqueName: \"kubernetes.io/projected/fb02b769-e281-4b1d-8bdc-b414fa58587f-kube-api-access-pn2jr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.130615 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.130693 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.130781 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr\" (UID: 
\"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.232483 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.232553 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.232626 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn2jr\" (UniqueName: \"kubernetes.io/projected/fb02b769-e281-4b1d-8bdc-b414fa58587f-kube-api-access-pn2jr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.232669 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.232713 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.232766 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.238641 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.238711 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.239169 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.239304 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.241219 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.257204 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn2jr\" (UniqueName: \"kubernetes.io/projected/fb02b769-e281-4b1d-8bdc-b414fa58587f-kube-api-access-pn2jr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.363598 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:36:53 crc kubenswrapper[4909]: I1201 11:36:53.866028 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr"] Dec 01 11:36:54 crc kubenswrapper[4909]: I1201 11:36:54.704523 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" event={"ID":"fb02b769-e281-4b1d-8bdc-b414fa58587f","Type":"ContainerStarted","Data":"ad1e1b8456d775b4430e06318de6f6047da5949d8b08372974bc9b3607ac4945"} Dec 01 11:36:54 crc kubenswrapper[4909]: I1201 11:36:54.704840 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" event={"ID":"fb02b769-e281-4b1d-8bdc-b414fa58587f","Type":"ContainerStarted","Data":"bbff175c1806f268817c07dda3da58b15b401d569296c70d804b83a37636cef4"} Dec 01 11:36:54 crc kubenswrapper[4909]: I1201 11:36:54.704639 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wvqmx" podUID="66b051e7-3e14-4cfe-8737-0c4035127110" containerName="registry-server" containerID="cri-o://d1c297e53e7a37c935e4861103b0c150228d0648d016eecb59cae23aa6828e5f" gracePeriod=2 Dec 01 11:36:54 crc kubenswrapper[4909]: I1201 11:36:54.735495 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" podStartSLOduration=1.295448122 podStartE2EDuration="1.735468693s" podCreationTimestamp="2025-12-01 11:36:53 +0000 UTC" firstStartedPulling="2025-12-01 11:36:53.867101653 +0000 UTC m=+3931.101572551" lastFinishedPulling="2025-12-01 11:36:54.307122224 +0000 UTC m=+3931.541593122" observedRunningTime="2025-12-01 11:36:54.729653843 +0000 UTC m=+3931.964124741" watchObservedRunningTime="2025-12-01 11:36:54.735468693 +0000 UTC m=+3931.969939611" Dec 01 11:36:55 crc 
kubenswrapper[4909]: I1201 11:36:55.138501 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wvqmx" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.290654 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b051e7-3e14-4cfe-8737-0c4035127110-utilities\") pod \"66b051e7-3e14-4cfe-8737-0c4035127110\" (UID: \"66b051e7-3e14-4cfe-8737-0c4035127110\") " Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.290725 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b051e7-3e14-4cfe-8737-0c4035127110-catalog-content\") pod \"66b051e7-3e14-4cfe-8737-0c4035127110\" (UID: \"66b051e7-3e14-4cfe-8737-0c4035127110\") " Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.290772 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh4zs\" (UniqueName: \"kubernetes.io/projected/66b051e7-3e14-4cfe-8737-0c4035127110-kube-api-access-kh4zs\") pod \"66b051e7-3e14-4cfe-8737-0c4035127110\" (UID: \"66b051e7-3e14-4cfe-8737-0c4035127110\") " Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.292312 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66b051e7-3e14-4cfe-8737-0c4035127110-utilities" (OuterVolumeSpecName: "utilities") pod "66b051e7-3e14-4cfe-8737-0c4035127110" (UID: "66b051e7-3e14-4cfe-8737-0c4035127110"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.295727 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b051e7-3e14-4cfe-8737-0c4035127110-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.296940 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66b051e7-3e14-4cfe-8737-0c4035127110-kube-api-access-kh4zs" (OuterVolumeSpecName: "kube-api-access-kh4zs") pod "66b051e7-3e14-4cfe-8737-0c4035127110" (UID: "66b051e7-3e14-4cfe-8737-0c4035127110"). InnerVolumeSpecName "kube-api-access-kh4zs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.321136 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66b051e7-3e14-4cfe-8737-0c4035127110-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66b051e7-3e14-4cfe-8737-0c4035127110" (UID: "66b051e7-3e14-4cfe-8737-0c4035127110"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.397239 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b051e7-3e14-4cfe-8737-0c4035127110-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.397279 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh4zs\" (UniqueName: \"kubernetes.io/projected/66b051e7-3e14-4cfe-8737-0c4035127110-kube-api-access-kh4zs\") on node \"crc\" DevicePath \"\"" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.527015 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-csz66"] Dec 01 11:36:55 crc kubenswrapper[4909]: E1201 11:36:55.529003 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b051e7-3e14-4cfe-8737-0c4035127110" containerName="extract-utilities" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.529036 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b051e7-3e14-4cfe-8737-0c4035127110" containerName="extract-utilities" Dec 01 11:36:55 crc kubenswrapper[4909]: E1201 11:36:55.529073 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b051e7-3e14-4cfe-8737-0c4035127110" containerName="extract-content" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.529084 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b051e7-3e14-4cfe-8737-0c4035127110" containerName="extract-content" Dec 01 11:36:55 crc kubenswrapper[4909]: E1201 11:36:55.529096 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b051e7-3e14-4cfe-8737-0c4035127110" containerName="registry-server" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.529103 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b051e7-3e14-4cfe-8737-0c4035127110" containerName="registry-server" Dec 01 11:36:55 crc 
kubenswrapper[4909]: I1201 11:36:55.529338 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="66b051e7-3e14-4cfe-8737-0c4035127110" containerName="registry-server" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.531708 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-csz66" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.535843 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-csz66"] Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.601401 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15ee1749-6194-49df-bed2-8c29f0ed4b0d-utilities\") pod \"redhat-operators-csz66\" (UID: \"15ee1749-6194-49df-bed2-8c29f0ed4b0d\") " pod="openshift-marketplace/redhat-operators-csz66" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.601722 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15ee1749-6194-49df-bed2-8c29f0ed4b0d-catalog-content\") pod \"redhat-operators-csz66\" (UID: \"15ee1749-6194-49df-bed2-8c29f0ed4b0d\") " pod="openshift-marketplace/redhat-operators-csz66" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.601927 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w65wn\" (UniqueName: \"kubernetes.io/projected/15ee1749-6194-49df-bed2-8c29f0ed4b0d-kube-api-access-w65wn\") pod \"redhat-operators-csz66\" (UID: \"15ee1749-6194-49df-bed2-8c29f0ed4b0d\") " pod="openshift-marketplace/redhat-operators-csz66" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.703286 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/15ee1749-6194-49df-bed2-8c29f0ed4b0d-utilities\") pod \"redhat-operators-csz66\" (UID: \"15ee1749-6194-49df-bed2-8c29f0ed4b0d\") " pod="openshift-marketplace/redhat-operators-csz66" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.703343 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15ee1749-6194-49df-bed2-8c29f0ed4b0d-catalog-content\") pod \"redhat-operators-csz66\" (UID: \"15ee1749-6194-49df-bed2-8c29f0ed4b0d\") " pod="openshift-marketplace/redhat-operators-csz66" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.703387 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w65wn\" (UniqueName: \"kubernetes.io/projected/15ee1749-6194-49df-bed2-8c29f0ed4b0d-kube-api-access-w65wn\") pod \"redhat-operators-csz66\" (UID: \"15ee1749-6194-49df-bed2-8c29f0ed4b0d\") " pod="openshift-marketplace/redhat-operators-csz66" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.703806 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15ee1749-6194-49df-bed2-8c29f0ed4b0d-utilities\") pod \"redhat-operators-csz66\" (UID: \"15ee1749-6194-49df-bed2-8c29f0ed4b0d\") " pod="openshift-marketplace/redhat-operators-csz66" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.703968 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15ee1749-6194-49df-bed2-8c29f0ed4b0d-catalog-content\") pod \"redhat-operators-csz66\" (UID: \"15ee1749-6194-49df-bed2-8c29f0ed4b0d\") " pod="openshift-marketplace/redhat-operators-csz66" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.715224 4909 generic.go:334] "Generic (PLEG): container finished" podID="66b051e7-3e14-4cfe-8737-0c4035127110" 
containerID="d1c297e53e7a37c935e4861103b0c150228d0648d016eecb59cae23aa6828e5f" exitCode=0 Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.715313 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wvqmx" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.715316 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvqmx" event={"ID":"66b051e7-3e14-4cfe-8737-0c4035127110","Type":"ContainerDied","Data":"d1c297e53e7a37c935e4861103b0c150228d0648d016eecb59cae23aa6828e5f"} Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.715595 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvqmx" event={"ID":"66b051e7-3e14-4cfe-8737-0c4035127110","Type":"ContainerDied","Data":"033c74808b53625fa432a2e2a29974fd4cad11e8099df41475e4b7ae60d57efd"} Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.715628 4909 scope.go:117] "RemoveContainer" containerID="d1c297e53e7a37c935e4861103b0c150228d0648d016eecb59cae23aa6828e5f" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.724071 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w65wn\" (UniqueName: \"kubernetes.io/projected/15ee1749-6194-49df-bed2-8c29f0ed4b0d-kube-api-access-w65wn\") pod \"redhat-operators-csz66\" (UID: \"15ee1749-6194-49df-bed2-8c29f0ed4b0d\") " pod="openshift-marketplace/redhat-operators-csz66" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.776290 4909 scope.go:117] "RemoveContainer" containerID="0bdee7062a997c217afaff34c52636d235352e255c8e9c58ea954c2c07d0bcef" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.785607 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wvqmx"] Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.795784 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-wvqmx"] Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.796140 4909 scope.go:117] "RemoveContainer" containerID="797560c7718fe4866da2135d7011bbf5f85f96af6641e63600ae8565d515ce18" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.845349 4909 scope.go:117] "RemoveContainer" containerID="d1c297e53e7a37c935e4861103b0c150228d0648d016eecb59cae23aa6828e5f" Dec 01 11:36:55 crc kubenswrapper[4909]: E1201 11:36:55.845957 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1c297e53e7a37c935e4861103b0c150228d0648d016eecb59cae23aa6828e5f\": container with ID starting with d1c297e53e7a37c935e4861103b0c150228d0648d016eecb59cae23aa6828e5f not found: ID does not exist" containerID="d1c297e53e7a37c935e4861103b0c150228d0648d016eecb59cae23aa6828e5f" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.846009 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1c297e53e7a37c935e4861103b0c150228d0648d016eecb59cae23aa6828e5f"} err="failed to get container status \"d1c297e53e7a37c935e4861103b0c150228d0648d016eecb59cae23aa6828e5f\": rpc error: code = NotFound desc = could not find container \"d1c297e53e7a37c935e4861103b0c150228d0648d016eecb59cae23aa6828e5f\": container with ID starting with d1c297e53e7a37c935e4861103b0c150228d0648d016eecb59cae23aa6828e5f not found: ID does not exist" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.846050 4909 scope.go:117] "RemoveContainer" containerID="0bdee7062a997c217afaff34c52636d235352e255c8e9c58ea954c2c07d0bcef" Dec 01 11:36:55 crc kubenswrapper[4909]: E1201 11:36:55.847408 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bdee7062a997c217afaff34c52636d235352e255c8e9c58ea954c2c07d0bcef\": container with ID starting with 
0bdee7062a997c217afaff34c52636d235352e255c8e9c58ea954c2c07d0bcef not found: ID does not exist" containerID="0bdee7062a997c217afaff34c52636d235352e255c8e9c58ea954c2c07d0bcef" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.847447 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bdee7062a997c217afaff34c52636d235352e255c8e9c58ea954c2c07d0bcef"} err="failed to get container status \"0bdee7062a997c217afaff34c52636d235352e255c8e9c58ea954c2c07d0bcef\": rpc error: code = NotFound desc = could not find container \"0bdee7062a997c217afaff34c52636d235352e255c8e9c58ea954c2c07d0bcef\": container with ID starting with 0bdee7062a997c217afaff34c52636d235352e255c8e9c58ea954c2c07d0bcef not found: ID does not exist" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.847471 4909 scope.go:117] "RemoveContainer" containerID="797560c7718fe4866da2135d7011bbf5f85f96af6641e63600ae8565d515ce18" Dec 01 11:36:55 crc kubenswrapper[4909]: E1201 11:36:55.847744 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"797560c7718fe4866da2135d7011bbf5f85f96af6641e63600ae8565d515ce18\": container with ID starting with 797560c7718fe4866da2135d7011bbf5f85f96af6641e63600ae8565d515ce18 not found: ID does not exist" containerID="797560c7718fe4866da2135d7011bbf5f85f96af6641e63600ae8565d515ce18" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.847766 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"797560c7718fe4866da2135d7011bbf5f85f96af6641e63600ae8565d515ce18"} err="failed to get container status \"797560c7718fe4866da2135d7011bbf5f85f96af6641e63600ae8565d515ce18\": rpc error: code = NotFound desc = could not find container \"797560c7718fe4866da2135d7011bbf5f85f96af6641e63600ae8565d515ce18\": container with ID starting with 797560c7718fe4866da2135d7011bbf5f85f96af6641e63600ae8565d515ce18 not found: ID does not 
exist" Dec 01 11:36:55 crc kubenswrapper[4909]: I1201 11:36:55.867525 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-csz66" Dec 01 11:36:56 crc kubenswrapper[4909]: I1201 11:36:56.384584 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-csz66"] Dec 01 11:36:56 crc kubenswrapper[4909]: W1201 11:36:56.391447 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15ee1749_6194_49df_bed2_8c29f0ed4b0d.slice/crio-c826761f986ddaccad338c5667f8ce7f96a033004c7f53e4434c3bfe8b134106 WatchSource:0}: Error finding container c826761f986ddaccad338c5667f8ce7f96a033004c7f53e4434c3bfe8b134106: Status 404 returned error can't find the container with id c826761f986ddaccad338c5667f8ce7f96a033004c7f53e4434c3bfe8b134106 Dec 01 11:36:56 crc kubenswrapper[4909]: I1201 11:36:56.727476 4909 generic.go:334] "Generic (PLEG): container finished" podID="15ee1749-6194-49df-bed2-8c29f0ed4b0d" containerID="92bc210692270619804043e0df08d279edfa12a72e049f8adf440913e287e1f7" exitCode=0 Dec 01 11:36:56 crc kubenswrapper[4909]: I1201 11:36:56.729016 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csz66" event={"ID":"15ee1749-6194-49df-bed2-8c29f0ed4b0d","Type":"ContainerDied","Data":"92bc210692270619804043e0df08d279edfa12a72e049f8adf440913e287e1f7"} Dec 01 11:36:56 crc kubenswrapper[4909]: I1201 11:36:56.729074 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csz66" event={"ID":"15ee1749-6194-49df-bed2-8c29f0ed4b0d","Type":"ContainerStarted","Data":"c826761f986ddaccad338c5667f8ce7f96a033004c7f53e4434c3bfe8b134106"} Dec 01 11:36:57 crc kubenswrapper[4909]: I1201 11:36:57.273478 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66b051e7-3e14-4cfe-8737-0c4035127110" 
path="/var/lib/kubelet/pods/66b051e7-3e14-4cfe-8737-0c4035127110/volumes" Dec 01 11:36:57 crc kubenswrapper[4909]: I1201 11:36:57.735149 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-skc74"] Dec 01 11:36:57 crc kubenswrapper[4909]: I1201 11:36:57.739316 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-skc74" Dec 01 11:36:57 crc kubenswrapper[4909]: I1201 11:36:57.745446 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-skc74"] Dec 01 11:36:57 crc kubenswrapper[4909]: I1201 11:36:57.749943 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csz66" event={"ID":"15ee1749-6194-49df-bed2-8c29f0ed4b0d","Type":"ContainerStarted","Data":"238ff5e9ce52788ebc30b607697e9ec26422ac928c5c9227296c90f5718d6774"} Dec 01 11:36:57 crc kubenswrapper[4909]: I1201 11:36:57.866505 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/801925f6-79c0-422d-8ddf-4c139597a0dc-catalog-content\") pod \"community-operators-skc74\" (UID: \"801925f6-79c0-422d-8ddf-4c139597a0dc\") " pod="openshift-marketplace/community-operators-skc74" Dec 01 11:36:57 crc kubenswrapper[4909]: I1201 11:36:57.866792 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6pvw\" (UniqueName: \"kubernetes.io/projected/801925f6-79c0-422d-8ddf-4c139597a0dc-kube-api-access-g6pvw\") pod \"community-operators-skc74\" (UID: \"801925f6-79c0-422d-8ddf-4c139597a0dc\") " pod="openshift-marketplace/community-operators-skc74" Dec 01 11:36:57 crc kubenswrapper[4909]: I1201 11:36:57.866961 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/801925f6-79c0-422d-8ddf-4c139597a0dc-utilities\") pod \"community-operators-skc74\" (UID: \"801925f6-79c0-422d-8ddf-4c139597a0dc\") " pod="openshift-marketplace/community-operators-skc74" Dec 01 11:36:57 crc kubenswrapper[4909]: I1201 11:36:57.970219 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/801925f6-79c0-422d-8ddf-4c139597a0dc-utilities\") pod \"community-operators-skc74\" (UID: \"801925f6-79c0-422d-8ddf-4c139597a0dc\") " pod="openshift-marketplace/community-operators-skc74" Dec 01 11:36:57 crc kubenswrapper[4909]: I1201 11:36:57.970403 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/801925f6-79c0-422d-8ddf-4c139597a0dc-catalog-content\") pod \"community-operators-skc74\" (UID: \"801925f6-79c0-422d-8ddf-4c139597a0dc\") " pod="openshift-marketplace/community-operators-skc74" Dec 01 11:36:57 crc kubenswrapper[4909]: I1201 11:36:57.970463 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6pvw\" (UniqueName: \"kubernetes.io/projected/801925f6-79c0-422d-8ddf-4c139597a0dc-kube-api-access-g6pvw\") pod \"community-operators-skc74\" (UID: \"801925f6-79c0-422d-8ddf-4c139597a0dc\") " pod="openshift-marketplace/community-operators-skc74" Dec 01 11:36:57 crc kubenswrapper[4909]: I1201 11:36:57.976247 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/801925f6-79c0-422d-8ddf-4c139597a0dc-utilities\") pod \"community-operators-skc74\" (UID: \"801925f6-79c0-422d-8ddf-4c139597a0dc\") " pod="openshift-marketplace/community-operators-skc74" Dec 01 11:36:57 crc kubenswrapper[4909]: I1201 11:36:57.976345 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/801925f6-79c0-422d-8ddf-4c139597a0dc-catalog-content\") pod \"community-operators-skc74\" (UID: \"801925f6-79c0-422d-8ddf-4c139597a0dc\") " pod="openshift-marketplace/community-operators-skc74" Dec 01 11:36:57 crc kubenswrapper[4909]: I1201 11:36:57.992954 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6pvw\" (UniqueName: \"kubernetes.io/projected/801925f6-79c0-422d-8ddf-4c139597a0dc-kube-api-access-g6pvw\") pod \"community-operators-skc74\" (UID: \"801925f6-79c0-422d-8ddf-4c139597a0dc\") " pod="openshift-marketplace/community-operators-skc74" Dec 01 11:36:58 crc kubenswrapper[4909]: I1201 11:36:58.116371 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-skc74" Dec 01 11:36:58 crc kubenswrapper[4909]: I1201 11:36:58.663479 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-skc74"] Dec 01 11:36:58 crc kubenswrapper[4909]: I1201 11:36:58.761965 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-skc74" event={"ID":"801925f6-79c0-422d-8ddf-4c139597a0dc","Type":"ContainerStarted","Data":"dbad919e23b462813bf675a67c83810e109317d01f2a7d66543c98356ae19a57"} Dec 01 11:36:59 crc kubenswrapper[4909]: I1201 11:36:59.773663 4909 generic.go:334] "Generic (PLEG): container finished" podID="15ee1749-6194-49df-bed2-8c29f0ed4b0d" containerID="238ff5e9ce52788ebc30b607697e9ec26422ac928c5c9227296c90f5718d6774" exitCode=0 Dec 01 11:36:59 crc kubenswrapper[4909]: I1201 11:36:59.773742 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csz66" event={"ID":"15ee1749-6194-49df-bed2-8c29f0ed4b0d","Type":"ContainerDied","Data":"238ff5e9ce52788ebc30b607697e9ec26422ac928c5c9227296c90f5718d6774"} Dec 01 11:36:59 crc kubenswrapper[4909]: I1201 11:36:59.776095 4909 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-skc74" event={"ID":"801925f6-79c0-422d-8ddf-4c139597a0dc","Type":"ContainerStarted","Data":"bc5d4c3007ae33c1cb21998b4d7908e80093488ee69a30870ae9cc3100e3c50d"} Dec 01 11:37:00 crc kubenswrapper[4909]: I1201 11:37:00.788202 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csz66" event={"ID":"15ee1749-6194-49df-bed2-8c29f0ed4b0d","Type":"ContainerStarted","Data":"1395f1d5c242c378c879486b78c81acc576f4e723a96cdfa1dbcbea724b3118a"} Dec 01 11:37:00 crc kubenswrapper[4909]: I1201 11:37:00.790338 4909 generic.go:334] "Generic (PLEG): container finished" podID="801925f6-79c0-422d-8ddf-4c139597a0dc" containerID="bc5d4c3007ae33c1cb21998b4d7908e80093488ee69a30870ae9cc3100e3c50d" exitCode=0 Dec 01 11:37:00 crc kubenswrapper[4909]: I1201 11:37:00.790380 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-skc74" event={"ID":"801925f6-79c0-422d-8ddf-4c139597a0dc","Type":"ContainerDied","Data":"bc5d4c3007ae33c1cb21998b4d7908e80093488ee69a30870ae9cc3100e3c50d"} Dec 01 11:37:00 crc kubenswrapper[4909]: I1201 11:37:00.818372 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-csz66" podStartSLOduration=2.011653563 podStartE2EDuration="5.818351061s" podCreationTimestamp="2025-12-01 11:36:55 +0000 UTC" firstStartedPulling="2025-12-01 11:36:56.731305325 +0000 UTC m=+3933.965776223" lastFinishedPulling="2025-12-01 11:37:00.538002823 +0000 UTC m=+3937.772473721" observedRunningTime="2025-12-01 11:37:00.808030822 +0000 UTC m=+3938.042501740" watchObservedRunningTime="2025-12-01 11:37:00.818351061 +0000 UTC m=+3938.052821959" Dec 01 11:37:01 crc kubenswrapper[4909]: I1201 11:37:01.801257 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-skc74" 
event={"ID":"801925f6-79c0-422d-8ddf-4c139597a0dc","Type":"ContainerStarted","Data":"96a15a791705079ed42999026b61b3ec248d57ca486d56540565391a999af87b"} Dec 01 11:37:02 crc kubenswrapper[4909]: I1201 11:37:02.812093 4909 generic.go:334] "Generic (PLEG): container finished" podID="801925f6-79c0-422d-8ddf-4c139597a0dc" containerID="96a15a791705079ed42999026b61b3ec248d57ca486d56540565391a999af87b" exitCode=0 Dec 01 11:37:02 crc kubenswrapper[4909]: I1201 11:37:02.812302 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-skc74" event={"ID":"801925f6-79c0-422d-8ddf-4c139597a0dc","Type":"ContainerDied","Data":"96a15a791705079ed42999026b61b3ec248d57ca486d56540565391a999af87b"} Dec 01 11:37:03 crc kubenswrapper[4909]: I1201 11:37:03.822373 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-skc74" event={"ID":"801925f6-79c0-422d-8ddf-4c139597a0dc","Type":"ContainerStarted","Data":"09583e22e67fe4e9b2056676787dbe9610b02f400897c332c4d2d04b930a531d"} Dec 01 11:37:03 crc kubenswrapper[4909]: I1201 11:37:03.842152 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-skc74" podStartSLOduration=4.395724924 podStartE2EDuration="6.842132192s" podCreationTimestamp="2025-12-01 11:36:57 +0000 UTC" firstStartedPulling="2025-12-01 11:37:00.792943485 +0000 UTC m=+3938.027414373" lastFinishedPulling="2025-12-01 11:37:03.239350743 +0000 UTC m=+3940.473821641" observedRunningTime="2025-12-01 11:37:03.839056466 +0000 UTC m=+3941.073527374" watchObservedRunningTime="2025-12-01 11:37:03.842132192 +0000 UTC m=+3941.076603090" Dec 01 11:37:05 crc kubenswrapper[4909]: I1201 11:37:05.868339 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-csz66" Dec 01 11:37:05 crc kubenswrapper[4909]: I1201 11:37:05.868634 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-csz66" Dec 01 11:37:05 crc kubenswrapper[4909]: I1201 11:37:05.912248 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-csz66" Dec 01 11:37:06 crc kubenswrapper[4909]: I1201 11:37:06.895575 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-csz66" Dec 01 11:37:07 crc kubenswrapper[4909]: I1201 11:37:07.906270 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-csz66"] Dec 01 11:37:08 crc kubenswrapper[4909]: I1201 11:37:08.116654 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-skc74" Dec 01 11:37:08 crc kubenswrapper[4909]: I1201 11:37:08.116707 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-skc74" Dec 01 11:37:08 crc kubenswrapper[4909]: I1201 11:37:08.164560 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-skc74" Dec 01 11:37:08 crc kubenswrapper[4909]: I1201 11:37:08.862702 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-csz66" podUID="15ee1749-6194-49df-bed2-8c29f0ed4b0d" containerName="registry-server" containerID="cri-o://1395f1d5c242c378c879486b78c81acc576f4e723a96cdfa1dbcbea724b3118a" gracePeriod=2 Dec 01 11:37:08 crc kubenswrapper[4909]: I1201 11:37:08.909432 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-skc74" Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.323222 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-csz66" Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.416804 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15ee1749-6194-49df-bed2-8c29f0ed4b0d-catalog-content\") pod \"15ee1749-6194-49df-bed2-8c29f0ed4b0d\" (UID: \"15ee1749-6194-49df-bed2-8c29f0ed4b0d\") " Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.416900 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w65wn\" (UniqueName: \"kubernetes.io/projected/15ee1749-6194-49df-bed2-8c29f0ed4b0d-kube-api-access-w65wn\") pod \"15ee1749-6194-49df-bed2-8c29f0ed4b0d\" (UID: \"15ee1749-6194-49df-bed2-8c29f0ed4b0d\") " Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.417133 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15ee1749-6194-49df-bed2-8c29f0ed4b0d-utilities\") pod \"15ee1749-6194-49df-bed2-8c29f0ed4b0d\" (UID: \"15ee1749-6194-49df-bed2-8c29f0ed4b0d\") " Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.418136 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15ee1749-6194-49df-bed2-8c29f0ed4b0d-utilities" (OuterVolumeSpecName: "utilities") pod "15ee1749-6194-49df-bed2-8c29f0ed4b0d" (UID: "15ee1749-6194-49df-bed2-8c29f0ed4b0d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.424464 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15ee1749-6194-49df-bed2-8c29f0ed4b0d-kube-api-access-w65wn" (OuterVolumeSpecName: "kube-api-access-w65wn") pod "15ee1749-6194-49df-bed2-8c29f0ed4b0d" (UID: "15ee1749-6194-49df-bed2-8c29f0ed4b0d"). InnerVolumeSpecName "kube-api-access-w65wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.514306 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15ee1749-6194-49df-bed2-8c29f0ed4b0d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15ee1749-6194-49df-bed2-8c29f0ed4b0d" (UID: "15ee1749-6194-49df-bed2-8c29f0ed4b0d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.519291 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15ee1749-6194-49df-bed2-8c29f0ed4b0d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.519332 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w65wn\" (UniqueName: \"kubernetes.io/projected/15ee1749-6194-49df-bed2-8c29f0ed4b0d-kube-api-access-w65wn\") on node \"crc\" DevicePath \"\"" Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.519345 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15ee1749-6194-49df-bed2-8c29f0ed4b0d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.876089 4909 generic.go:334] "Generic (PLEG): container finished" podID="15ee1749-6194-49df-bed2-8c29f0ed4b0d" containerID="1395f1d5c242c378c879486b78c81acc576f4e723a96cdfa1dbcbea724b3118a" exitCode=0 Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.876171 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-csz66" Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.876171 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csz66" event={"ID":"15ee1749-6194-49df-bed2-8c29f0ed4b0d","Type":"ContainerDied","Data":"1395f1d5c242c378c879486b78c81acc576f4e723a96cdfa1dbcbea724b3118a"} Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.876229 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csz66" event={"ID":"15ee1749-6194-49df-bed2-8c29f0ed4b0d","Type":"ContainerDied","Data":"c826761f986ddaccad338c5667f8ce7f96a033004c7f53e4434c3bfe8b134106"} Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.876256 4909 scope.go:117] "RemoveContainer" containerID="1395f1d5c242c378c879486b78c81acc576f4e723a96cdfa1dbcbea724b3118a" Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.904270 4909 scope.go:117] "RemoveContainer" containerID="238ff5e9ce52788ebc30b607697e9ec26422ac928c5c9227296c90f5718d6774" Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.920409 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-csz66"] Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.928671 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-csz66"] Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.957105 4909 scope.go:117] "RemoveContainer" containerID="92bc210692270619804043e0df08d279edfa12a72e049f8adf440913e287e1f7" Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.996398 4909 scope.go:117] "RemoveContainer" containerID="1395f1d5c242c378c879486b78c81acc576f4e723a96cdfa1dbcbea724b3118a" Dec 01 11:37:09 crc kubenswrapper[4909]: E1201 11:37:09.997242 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1395f1d5c242c378c879486b78c81acc576f4e723a96cdfa1dbcbea724b3118a\": container with ID starting with 1395f1d5c242c378c879486b78c81acc576f4e723a96cdfa1dbcbea724b3118a not found: ID does not exist" containerID="1395f1d5c242c378c879486b78c81acc576f4e723a96cdfa1dbcbea724b3118a" Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.997296 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1395f1d5c242c378c879486b78c81acc576f4e723a96cdfa1dbcbea724b3118a"} err="failed to get container status \"1395f1d5c242c378c879486b78c81acc576f4e723a96cdfa1dbcbea724b3118a\": rpc error: code = NotFound desc = could not find container \"1395f1d5c242c378c879486b78c81acc576f4e723a96cdfa1dbcbea724b3118a\": container with ID starting with 1395f1d5c242c378c879486b78c81acc576f4e723a96cdfa1dbcbea724b3118a not found: ID does not exist" Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.997329 4909 scope.go:117] "RemoveContainer" containerID="238ff5e9ce52788ebc30b607697e9ec26422ac928c5c9227296c90f5718d6774" Dec 01 11:37:09 crc kubenswrapper[4909]: E1201 11:37:09.997706 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"238ff5e9ce52788ebc30b607697e9ec26422ac928c5c9227296c90f5718d6774\": container with ID starting with 238ff5e9ce52788ebc30b607697e9ec26422ac928c5c9227296c90f5718d6774 not found: ID does not exist" containerID="238ff5e9ce52788ebc30b607697e9ec26422ac928c5c9227296c90f5718d6774" Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.997731 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"238ff5e9ce52788ebc30b607697e9ec26422ac928c5c9227296c90f5718d6774"} err="failed to get container status \"238ff5e9ce52788ebc30b607697e9ec26422ac928c5c9227296c90f5718d6774\": rpc error: code = NotFound desc = could not find container \"238ff5e9ce52788ebc30b607697e9ec26422ac928c5c9227296c90f5718d6774\": container with ID 
starting with 238ff5e9ce52788ebc30b607697e9ec26422ac928c5c9227296c90f5718d6774 not found: ID does not exist" Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.997748 4909 scope.go:117] "RemoveContainer" containerID="92bc210692270619804043e0df08d279edfa12a72e049f8adf440913e287e1f7" Dec 01 11:37:09 crc kubenswrapper[4909]: E1201 11:37:09.998092 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92bc210692270619804043e0df08d279edfa12a72e049f8adf440913e287e1f7\": container with ID starting with 92bc210692270619804043e0df08d279edfa12a72e049f8adf440913e287e1f7 not found: ID does not exist" containerID="92bc210692270619804043e0df08d279edfa12a72e049f8adf440913e287e1f7" Dec 01 11:37:09 crc kubenswrapper[4909]: I1201 11:37:09.998119 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92bc210692270619804043e0df08d279edfa12a72e049f8adf440913e287e1f7"} err="failed to get container status \"92bc210692270619804043e0df08d279edfa12a72e049f8adf440913e287e1f7\": rpc error: code = NotFound desc = could not find container \"92bc210692270619804043e0df08d279edfa12a72e049f8adf440913e287e1f7\": container with ID starting with 92bc210692270619804043e0df08d279edfa12a72e049f8adf440913e287e1f7 not found: ID does not exist" Dec 01 11:37:10 crc kubenswrapper[4909]: I1201 11:37:10.521676 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-skc74"] Dec 01 11:37:10 crc kubenswrapper[4909]: I1201 11:37:10.888540 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-skc74" podUID="801925f6-79c0-422d-8ddf-4c139597a0dc" containerName="registry-server" containerID="cri-o://09583e22e67fe4e9b2056676787dbe9610b02f400897c332c4d2d04b930a531d" gracePeriod=2 Dec 01 11:37:11 crc kubenswrapper[4909]: I1201 11:37:11.266820 4909 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="15ee1749-6194-49df-bed2-8c29f0ed4b0d" path="/var/lib/kubelet/pods/15ee1749-6194-49df-bed2-8c29f0ed4b0d/volumes" Dec 01 11:37:11 crc kubenswrapper[4909]: I1201 11:37:11.363370 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-skc74" Dec 01 11:37:11 crc kubenswrapper[4909]: I1201 11:37:11.453651 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6pvw\" (UniqueName: \"kubernetes.io/projected/801925f6-79c0-422d-8ddf-4c139597a0dc-kube-api-access-g6pvw\") pod \"801925f6-79c0-422d-8ddf-4c139597a0dc\" (UID: \"801925f6-79c0-422d-8ddf-4c139597a0dc\") " Dec 01 11:37:11 crc kubenswrapper[4909]: I1201 11:37:11.454192 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/801925f6-79c0-422d-8ddf-4c139597a0dc-catalog-content\") pod \"801925f6-79c0-422d-8ddf-4c139597a0dc\" (UID: \"801925f6-79c0-422d-8ddf-4c139597a0dc\") " Dec 01 11:37:11 crc kubenswrapper[4909]: I1201 11:37:11.454296 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/801925f6-79c0-422d-8ddf-4c139597a0dc-utilities\") pod \"801925f6-79c0-422d-8ddf-4c139597a0dc\" (UID: \"801925f6-79c0-422d-8ddf-4c139597a0dc\") " Dec 01 11:37:11 crc kubenswrapper[4909]: I1201 11:37:11.455518 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/801925f6-79c0-422d-8ddf-4c139597a0dc-utilities" (OuterVolumeSpecName: "utilities") pod "801925f6-79c0-422d-8ddf-4c139597a0dc" (UID: "801925f6-79c0-422d-8ddf-4c139597a0dc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:37:11 crc kubenswrapper[4909]: I1201 11:37:11.465176 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/801925f6-79c0-422d-8ddf-4c139597a0dc-kube-api-access-g6pvw" (OuterVolumeSpecName: "kube-api-access-g6pvw") pod "801925f6-79c0-422d-8ddf-4c139597a0dc" (UID: "801925f6-79c0-422d-8ddf-4c139597a0dc"). InnerVolumeSpecName "kube-api-access-g6pvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:37:11 crc kubenswrapper[4909]: I1201 11:37:11.498421 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/801925f6-79c0-422d-8ddf-4c139597a0dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "801925f6-79c0-422d-8ddf-4c139597a0dc" (UID: "801925f6-79c0-422d-8ddf-4c139597a0dc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:37:11 crc kubenswrapper[4909]: I1201 11:37:11.555798 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6pvw\" (UniqueName: \"kubernetes.io/projected/801925f6-79c0-422d-8ddf-4c139597a0dc-kube-api-access-g6pvw\") on node \"crc\" DevicePath \"\"" Dec 01 11:37:11 crc kubenswrapper[4909]: I1201 11:37:11.555830 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/801925f6-79c0-422d-8ddf-4c139597a0dc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:37:11 crc kubenswrapper[4909]: I1201 11:37:11.555839 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/801925f6-79c0-422d-8ddf-4c139597a0dc-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:37:11 crc kubenswrapper[4909]: I1201 11:37:11.898224 4909 generic.go:334] "Generic (PLEG): container finished" podID="801925f6-79c0-422d-8ddf-4c139597a0dc" 
containerID="09583e22e67fe4e9b2056676787dbe9610b02f400897c332c4d2d04b930a531d" exitCode=0 Dec 01 11:37:11 crc kubenswrapper[4909]: I1201 11:37:11.898265 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-skc74" event={"ID":"801925f6-79c0-422d-8ddf-4c139597a0dc","Type":"ContainerDied","Data":"09583e22e67fe4e9b2056676787dbe9610b02f400897c332c4d2d04b930a531d"} Dec 01 11:37:11 crc kubenswrapper[4909]: I1201 11:37:11.898279 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-skc74" Dec 01 11:37:11 crc kubenswrapper[4909]: I1201 11:37:11.898288 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-skc74" event={"ID":"801925f6-79c0-422d-8ddf-4c139597a0dc","Type":"ContainerDied","Data":"dbad919e23b462813bf675a67c83810e109317d01f2a7d66543c98356ae19a57"} Dec 01 11:37:11 crc kubenswrapper[4909]: I1201 11:37:11.898303 4909 scope.go:117] "RemoveContainer" containerID="09583e22e67fe4e9b2056676787dbe9610b02f400897c332c4d2d04b930a531d" Dec 01 11:37:11 crc kubenswrapper[4909]: I1201 11:37:11.932093 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-skc74"] Dec 01 11:37:11 crc kubenswrapper[4909]: I1201 11:37:11.937693 4909 scope.go:117] "RemoveContainer" containerID="96a15a791705079ed42999026b61b3ec248d57ca486d56540565391a999af87b" Dec 01 11:37:11 crc kubenswrapper[4909]: I1201 11:37:11.940744 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-skc74"] Dec 01 11:37:11 crc kubenswrapper[4909]: I1201 11:37:11.976959 4909 scope.go:117] "RemoveContainer" containerID="bc5d4c3007ae33c1cb21998b4d7908e80093488ee69a30870ae9cc3100e3c50d" Dec 01 11:37:12 crc kubenswrapper[4909]: I1201 11:37:12.009727 4909 scope.go:117] "RemoveContainer" containerID="09583e22e67fe4e9b2056676787dbe9610b02f400897c332c4d2d04b930a531d" Dec 01 
11:37:12 crc kubenswrapper[4909]: E1201 11:37:12.010615 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09583e22e67fe4e9b2056676787dbe9610b02f400897c332c4d2d04b930a531d\": container with ID starting with 09583e22e67fe4e9b2056676787dbe9610b02f400897c332c4d2d04b930a531d not found: ID does not exist" containerID="09583e22e67fe4e9b2056676787dbe9610b02f400897c332c4d2d04b930a531d" Dec 01 11:37:12 crc kubenswrapper[4909]: I1201 11:37:12.010694 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09583e22e67fe4e9b2056676787dbe9610b02f400897c332c4d2d04b930a531d"} err="failed to get container status \"09583e22e67fe4e9b2056676787dbe9610b02f400897c332c4d2d04b930a531d\": rpc error: code = NotFound desc = could not find container \"09583e22e67fe4e9b2056676787dbe9610b02f400897c332c4d2d04b930a531d\": container with ID starting with 09583e22e67fe4e9b2056676787dbe9610b02f400897c332c4d2d04b930a531d not found: ID does not exist" Dec 01 11:37:12 crc kubenswrapper[4909]: I1201 11:37:12.010754 4909 scope.go:117] "RemoveContainer" containerID="96a15a791705079ed42999026b61b3ec248d57ca486d56540565391a999af87b" Dec 01 11:37:12 crc kubenswrapper[4909]: E1201 11:37:12.011521 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96a15a791705079ed42999026b61b3ec248d57ca486d56540565391a999af87b\": container with ID starting with 96a15a791705079ed42999026b61b3ec248d57ca486d56540565391a999af87b not found: ID does not exist" containerID="96a15a791705079ed42999026b61b3ec248d57ca486d56540565391a999af87b" Dec 01 11:37:12 crc kubenswrapper[4909]: I1201 11:37:12.011555 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96a15a791705079ed42999026b61b3ec248d57ca486d56540565391a999af87b"} err="failed to get container status 
\"96a15a791705079ed42999026b61b3ec248d57ca486d56540565391a999af87b\": rpc error: code = NotFound desc = could not find container \"96a15a791705079ed42999026b61b3ec248d57ca486d56540565391a999af87b\": container with ID starting with 96a15a791705079ed42999026b61b3ec248d57ca486d56540565391a999af87b not found: ID does not exist" Dec 01 11:37:12 crc kubenswrapper[4909]: I1201 11:37:12.011579 4909 scope.go:117] "RemoveContainer" containerID="bc5d4c3007ae33c1cb21998b4d7908e80093488ee69a30870ae9cc3100e3c50d" Dec 01 11:37:12 crc kubenswrapper[4909]: E1201 11:37:12.011997 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc5d4c3007ae33c1cb21998b4d7908e80093488ee69a30870ae9cc3100e3c50d\": container with ID starting with bc5d4c3007ae33c1cb21998b4d7908e80093488ee69a30870ae9cc3100e3c50d not found: ID does not exist" containerID="bc5d4c3007ae33c1cb21998b4d7908e80093488ee69a30870ae9cc3100e3c50d" Dec 01 11:37:12 crc kubenswrapper[4909]: I1201 11:37:12.012069 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc5d4c3007ae33c1cb21998b4d7908e80093488ee69a30870ae9cc3100e3c50d"} err="failed to get container status \"bc5d4c3007ae33c1cb21998b4d7908e80093488ee69a30870ae9cc3100e3c50d\": rpc error: code = NotFound desc = could not find container \"bc5d4c3007ae33c1cb21998b4d7908e80093488ee69a30870ae9cc3100e3c50d\": container with ID starting with bc5d4c3007ae33c1cb21998b4d7908e80093488ee69a30870ae9cc3100e3c50d not found: ID does not exist" Dec 01 11:37:13 crc kubenswrapper[4909]: I1201 11:37:13.274609 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="801925f6-79c0-422d-8ddf-4c139597a0dc" path="/var/lib/kubelet/pods/801925f6-79c0-422d-8ddf-4c139597a0dc/volumes" Dec 01 11:37:42 crc kubenswrapper[4909]: I1201 11:37:42.796297 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ltztp"] Dec 01 11:37:42 
crc kubenswrapper[4909]: E1201 11:37:42.797176 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801925f6-79c0-422d-8ddf-4c139597a0dc" containerName="extract-utilities" Dec 01 11:37:42 crc kubenswrapper[4909]: I1201 11:37:42.797191 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="801925f6-79c0-422d-8ddf-4c139597a0dc" containerName="extract-utilities" Dec 01 11:37:42 crc kubenswrapper[4909]: E1201 11:37:42.797214 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15ee1749-6194-49df-bed2-8c29f0ed4b0d" containerName="extract-utilities" Dec 01 11:37:42 crc kubenswrapper[4909]: I1201 11:37:42.797222 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ee1749-6194-49df-bed2-8c29f0ed4b0d" containerName="extract-utilities" Dec 01 11:37:42 crc kubenswrapper[4909]: E1201 11:37:42.797242 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801925f6-79c0-422d-8ddf-4c139597a0dc" containerName="extract-content" Dec 01 11:37:42 crc kubenswrapper[4909]: I1201 11:37:42.797253 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="801925f6-79c0-422d-8ddf-4c139597a0dc" containerName="extract-content" Dec 01 11:37:42 crc kubenswrapper[4909]: E1201 11:37:42.797287 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801925f6-79c0-422d-8ddf-4c139597a0dc" containerName="registry-server" Dec 01 11:37:42 crc kubenswrapper[4909]: I1201 11:37:42.797295 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="801925f6-79c0-422d-8ddf-4c139597a0dc" containerName="registry-server" Dec 01 11:37:42 crc kubenswrapper[4909]: E1201 11:37:42.797310 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15ee1749-6194-49df-bed2-8c29f0ed4b0d" containerName="extract-content" Dec 01 11:37:42 crc kubenswrapper[4909]: I1201 11:37:42.797318 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ee1749-6194-49df-bed2-8c29f0ed4b0d" containerName="extract-content" Dec 01 11:37:42 crc 
kubenswrapper[4909]: E1201 11:37:42.797332 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15ee1749-6194-49df-bed2-8c29f0ed4b0d" containerName="registry-server" Dec 01 11:37:42 crc kubenswrapper[4909]: I1201 11:37:42.797340 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ee1749-6194-49df-bed2-8c29f0ed4b0d" containerName="registry-server" Dec 01 11:37:42 crc kubenswrapper[4909]: I1201 11:37:42.797529 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="15ee1749-6194-49df-bed2-8c29f0ed4b0d" containerName="registry-server" Dec 01 11:37:42 crc kubenswrapper[4909]: I1201 11:37:42.797550 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="801925f6-79c0-422d-8ddf-4c139597a0dc" containerName="registry-server" Dec 01 11:37:42 crc kubenswrapper[4909]: I1201 11:37:42.799155 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ltztp" Dec 01 11:37:42 crc kubenswrapper[4909]: I1201 11:37:42.821074 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ltztp"] Dec 01 11:37:42 crc kubenswrapper[4909]: I1201 11:37:42.874682 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f05f63ff-8f5d-4502-a78c-9cc6332f6cb7-catalog-content\") pod \"certified-operators-ltztp\" (UID: \"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7\") " pod="openshift-marketplace/certified-operators-ltztp" Dec 01 11:37:42 crc kubenswrapper[4909]: I1201 11:37:42.875019 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f05f63ff-8f5d-4502-a78c-9cc6332f6cb7-utilities\") pod \"certified-operators-ltztp\" (UID: \"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7\") " pod="openshift-marketplace/certified-operators-ltztp" Dec 01 11:37:42 crc 
kubenswrapper[4909]: I1201 11:37:42.875259 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkpql\" (UniqueName: \"kubernetes.io/projected/f05f63ff-8f5d-4502-a78c-9cc6332f6cb7-kube-api-access-kkpql\") pod \"certified-operators-ltztp\" (UID: \"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7\") " pod="openshift-marketplace/certified-operators-ltztp" Dec 01 11:37:42 crc kubenswrapper[4909]: I1201 11:37:42.976666 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f05f63ff-8f5d-4502-a78c-9cc6332f6cb7-utilities\") pod \"certified-operators-ltztp\" (UID: \"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7\") " pod="openshift-marketplace/certified-operators-ltztp" Dec 01 11:37:42 crc kubenswrapper[4909]: I1201 11:37:42.976759 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkpql\" (UniqueName: \"kubernetes.io/projected/f05f63ff-8f5d-4502-a78c-9cc6332f6cb7-kube-api-access-kkpql\") pod \"certified-operators-ltztp\" (UID: \"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7\") " pod="openshift-marketplace/certified-operators-ltztp" Dec 01 11:37:42 crc kubenswrapper[4909]: I1201 11:37:42.976800 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f05f63ff-8f5d-4502-a78c-9cc6332f6cb7-catalog-content\") pod \"certified-operators-ltztp\" (UID: \"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7\") " pod="openshift-marketplace/certified-operators-ltztp" Dec 01 11:37:42 crc kubenswrapper[4909]: I1201 11:37:42.977287 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f05f63ff-8f5d-4502-a78c-9cc6332f6cb7-utilities\") pod \"certified-operators-ltztp\" (UID: \"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7\") " pod="openshift-marketplace/certified-operators-ltztp" Dec 01 11:37:42 crc 
kubenswrapper[4909]: I1201 11:37:42.977305 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f05f63ff-8f5d-4502-a78c-9cc6332f6cb7-catalog-content\") pod \"certified-operators-ltztp\" (UID: \"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7\") " pod="openshift-marketplace/certified-operators-ltztp" Dec 01 11:37:42 crc kubenswrapper[4909]: I1201 11:37:42.996983 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkpql\" (UniqueName: \"kubernetes.io/projected/f05f63ff-8f5d-4502-a78c-9cc6332f6cb7-kube-api-access-kkpql\") pod \"certified-operators-ltztp\" (UID: \"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7\") " pod="openshift-marketplace/certified-operators-ltztp" Dec 01 11:37:43 crc kubenswrapper[4909]: I1201 11:37:43.172758 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ltztp" Dec 01 11:37:43 crc kubenswrapper[4909]: I1201 11:37:43.655229 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ltztp"] Dec 01 11:37:44 crc kubenswrapper[4909]: I1201 11:37:44.167232 4909 generic.go:334] "Generic (PLEG): container finished" podID="f05f63ff-8f5d-4502-a78c-9cc6332f6cb7" containerID="28501e256b1a89ebd2cf54488a9f351b3663b8b74f7d0c86143dbf99798c7f22" exitCode=0 Dec 01 11:37:44 crc kubenswrapper[4909]: I1201 11:37:44.167357 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ltztp" event={"ID":"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7","Type":"ContainerDied","Data":"28501e256b1a89ebd2cf54488a9f351b3663b8b74f7d0c86143dbf99798c7f22"} Dec 01 11:37:44 crc kubenswrapper[4909]: I1201 11:37:44.167648 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ltztp" 
event={"ID":"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7","Type":"ContainerStarted","Data":"75ca0b67209b9af138ec31ed6dd5229a536b99545f6179e93f3104c521fa65a0"} Dec 01 11:37:45 crc kubenswrapper[4909]: I1201 11:37:45.179334 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ltztp" event={"ID":"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7","Type":"ContainerStarted","Data":"0497cf7fe70200b552be68f0df4d479fb6498e779ab4743155f4cac73b2ffd26"} Dec 01 11:37:46 crc kubenswrapper[4909]: I1201 11:37:46.192060 4909 generic.go:334] "Generic (PLEG): container finished" podID="f05f63ff-8f5d-4502-a78c-9cc6332f6cb7" containerID="0497cf7fe70200b552be68f0df4d479fb6498e779ab4743155f4cac73b2ffd26" exitCode=0 Dec 01 11:37:46 crc kubenswrapper[4909]: I1201 11:37:46.192184 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ltztp" event={"ID":"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7","Type":"ContainerDied","Data":"0497cf7fe70200b552be68f0df4d479fb6498e779ab4743155f4cac73b2ffd26"} Dec 01 11:37:47 crc kubenswrapper[4909]: I1201 11:37:47.202308 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ltztp" event={"ID":"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7","Type":"ContainerStarted","Data":"c1060b4c2aa38532e8722bbb39c3fbe79a645080be108403545d62a2aa79d991"} Dec 01 11:37:47 crc kubenswrapper[4909]: I1201 11:37:47.220648 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ltztp" podStartSLOduration=2.717012331 podStartE2EDuration="5.220632102s" podCreationTimestamp="2025-12-01 11:37:42 +0000 UTC" firstStartedPulling="2025-12-01 11:37:44.170150373 +0000 UTC m=+3981.404621301" lastFinishedPulling="2025-12-01 11:37:46.673770174 +0000 UTC m=+3983.908241072" observedRunningTime="2025-12-01 11:37:47.218600709 +0000 UTC m=+3984.453071607" watchObservedRunningTime="2025-12-01 11:37:47.220632102 +0000 UTC 
m=+3984.455103000" Dec 01 11:37:53 crc kubenswrapper[4909]: I1201 11:37:53.173114 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ltztp" Dec 01 11:37:53 crc kubenswrapper[4909]: I1201 11:37:53.174184 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ltztp" Dec 01 11:37:53 crc kubenswrapper[4909]: I1201 11:37:53.282078 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ltztp" Dec 01 11:37:53 crc kubenswrapper[4909]: I1201 11:37:53.332024 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ltztp" Dec 01 11:37:53 crc kubenswrapper[4909]: I1201 11:37:53.522044 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ltztp"] Dec 01 11:37:55 crc kubenswrapper[4909]: I1201 11:37:55.289360 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ltztp" podUID="f05f63ff-8f5d-4502-a78c-9cc6332f6cb7" containerName="registry-server" containerID="cri-o://c1060b4c2aa38532e8722bbb39c3fbe79a645080be108403545d62a2aa79d991" gracePeriod=2 Dec 01 11:37:56 crc kubenswrapper[4909]: I1201 11:37:56.303074 4909 generic.go:334] "Generic (PLEG): container finished" podID="f05f63ff-8f5d-4502-a78c-9cc6332f6cb7" containerID="c1060b4c2aa38532e8722bbb39c3fbe79a645080be108403545d62a2aa79d991" exitCode=0 Dec 01 11:37:56 crc kubenswrapper[4909]: I1201 11:37:56.303126 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ltztp" event={"ID":"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7","Type":"ContainerDied","Data":"c1060b4c2aa38532e8722bbb39c3fbe79a645080be108403545d62a2aa79d991"} Dec 01 11:37:56 crc kubenswrapper[4909]: I1201 11:37:56.427956 4909 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ltztp" Dec 01 11:37:56 crc kubenswrapper[4909]: I1201 11:37:56.553775 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f05f63ff-8f5d-4502-a78c-9cc6332f6cb7-utilities\") pod \"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7\" (UID: \"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7\") " Dec 01 11:37:56 crc kubenswrapper[4909]: I1201 11:37:56.554032 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkpql\" (UniqueName: \"kubernetes.io/projected/f05f63ff-8f5d-4502-a78c-9cc6332f6cb7-kube-api-access-kkpql\") pod \"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7\" (UID: \"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7\") " Dec 01 11:37:56 crc kubenswrapper[4909]: I1201 11:37:56.554191 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f05f63ff-8f5d-4502-a78c-9cc6332f6cb7-catalog-content\") pod \"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7\" (UID: \"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7\") " Dec 01 11:37:56 crc kubenswrapper[4909]: I1201 11:37:56.554790 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f05f63ff-8f5d-4502-a78c-9cc6332f6cb7-utilities" (OuterVolumeSpecName: "utilities") pod "f05f63ff-8f5d-4502-a78c-9cc6332f6cb7" (UID: "f05f63ff-8f5d-4502-a78c-9cc6332f6cb7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:37:56 crc kubenswrapper[4909]: I1201 11:37:56.564342 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f05f63ff-8f5d-4502-a78c-9cc6332f6cb7-kube-api-access-kkpql" (OuterVolumeSpecName: "kube-api-access-kkpql") pod "f05f63ff-8f5d-4502-a78c-9cc6332f6cb7" (UID: "f05f63ff-8f5d-4502-a78c-9cc6332f6cb7"). 
InnerVolumeSpecName "kube-api-access-kkpql". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:37:56 crc kubenswrapper[4909]: I1201 11:37:56.598919 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f05f63ff-8f5d-4502-a78c-9cc6332f6cb7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f05f63ff-8f5d-4502-a78c-9cc6332f6cb7" (UID: "f05f63ff-8f5d-4502-a78c-9cc6332f6cb7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:37:56 crc kubenswrapper[4909]: I1201 11:37:56.656249 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f05f63ff-8f5d-4502-a78c-9cc6332f6cb7-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:37:56 crc kubenswrapper[4909]: I1201 11:37:56.656283 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkpql\" (UniqueName: \"kubernetes.io/projected/f05f63ff-8f5d-4502-a78c-9cc6332f6cb7-kube-api-access-kkpql\") on node \"crc\" DevicePath \"\"" Dec 01 11:37:56 crc kubenswrapper[4909]: I1201 11:37:56.656294 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f05f63ff-8f5d-4502-a78c-9cc6332f6cb7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:37:57 crc kubenswrapper[4909]: I1201 11:37:57.319299 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ltztp" event={"ID":"f05f63ff-8f5d-4502-a78c-9cc6332f6cb7","Type":"ContainerDied","Data":"75ca0b67209b9af138ec31ed6dd5229a536b99545f6179e93f3104c521fa65a0"} Dec 01 11:37:57 crc kubenswrapper[4909]: I1201 11:37:57.319362 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ltztp" Dec 01 11:37:57 crc kubenswrapper[4909]: I1201 11:37:57.319842 4909 scope.go:117] "RemoveContainer" containerID="c1060b4c2aa38532e8722bbb39c3fbe79a645080be108403545d62a2aa79d991" Dec 01 11:37:57 crc kubenswrapper[4909]: I1201 11:37:57.349984 4909 scope.go:117] "RemoveContainer" containerID="0497cf7fe70200b552be68f0df4d479fb6498e779ab4743155f4cac73b2ffd26" Dec 01 11:37:57 crc kubenswrapper[4909]: I1201 11:37:57.355228 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ltztp"] Dec 01 11:37:57 crc kubenswrapper[4909]: I1201 11:37:57.379348 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ltztp"] Dec 01 11:37:57 crc kubenswrapper[4909]: I1201 11:37:57.387274 4909 scope.go:117] "RemoveContainer" containerID="28501e256b1a89ebd2cf54488a9f351b3663b8b74f7d0c86143dbf99798c7f22" Dec 01 11:37:59 crc kubenswrapper[4909]: I1201 11:37:59.278132 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f05f63ff-8f5d-4502-a78c-9cc6332f6cb7" path="/var/lib/kubelet/pods/f05f63ff-8f5d-4502-a78c-9cc6332f6cb7/volumes" Dec 01 11:38:36 crc kubenswrapper[4909]: I1201 11:38:36.194177 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:38:36 crc kubenswrapper[4909]: I1201 11:38:36.195422 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:39:06 crc kubenswrapper[4909]: 
I1201 11:39:06.194407 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:39:06 crc kubenswrapper[4909]: I1201 11:39:06.194825 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:39:36 crc kubenswrapper[4909]: I1201 11:39:36.193202 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:39:36 crc kubenswrapper[4909]: I1201 11:39:36.193832 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:39:36 crc kubenswrapper[4909]: I1201 11:39:36.193907 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 11:39:36 crc kubenswrapper[4909]: I1201 11:39:36.194692 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"74b7b026b70e8ab5e9f6682b27a220aa61db7324065195ffb38680f8406276f8"} 
pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 11:39:36 crc kubenswrapper[4909]: I1201 11:39:36.194762 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" containerID="cri-o://74b7b026b70e8ab5e9f6682b27a220aa61db7324065195ffb38680f8406276f8" gracePeriod=600 Dec 01 11:39:37 crc kubenswrapper[4909]: I1201 11:39:37.152952 4909 generic.go:334] "Generic (PLEG): container finished" podID="672850e4-d044-44cc-b8a2-517dc1a285be" containerID="74b7b026b70e8ab5e9f6682b27a220aa61db7324065195ffb38680f8406276f8" exitCode=0 Dec 01 11:39:37 crc kubenswrapper[4909]: I1201 11:39:37.153032 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerDied","Data":"74b7b026b70e8ab5e9f6682b27a220aa61db7324065195ffb38680f8406276f8"} Dec 01 11:39:37 crc kubenswrapper[4909]: I1201 11:39:37.153416 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2"} Dec 01 11:39:37 crc kubenswrapper[4909]: I1201 11:39:37.153436 4909 scope.go:117] "RemoveContainer" containerID="68412701fb13f24d706156b48c6a1781a7c3284fd67525e7333b237c9c012406" Dec 01 11:41:36 crc kubenswrapper[4909]: I1201 11:41:36.193654 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 01 11:41:36 crc kubenswrapper[4909]: I1201 11:41:36.194211 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:42:06 crc kubenswrapper[4909]: I1201 11:42:06.193837 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:42:06 crc kubenswrapper[4909]: I1201 11:42:06.194569 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:42:36 crc kubenswrapper[4909]: I1201 11:42:36.193662 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:42:36 crc kubenswrapper[4909]: I1201 11:42:36.194440 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:42:36 crc kubenswrapper[4909]: I1201 11:42:36.194501 4909 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 11:42:36 crc kubenswrapper[4909]: I1201 11:42:36.195538 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2"} pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 11:42:36 crc kubenswrapper[4909]: I1201 11:42:36.195608 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" containerID="cri-o://b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" gracePeriod=600 Dec 01 11:42:36 crc kubenswrapper[4909]: E1201 11:42:36.329132 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:42:36 crc kubenswrapper[4909]: I1201 11:42:36.698464 4909 generic.go:334] "Generic (PLEG): container finished" podID="672850e4-d044-44cc-b8a2-517dc1a285be" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" exitCode=0 Dec 01 11:42:36 crc kubenswrapper[4909]: I1201 11:42:36.698909 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" 
event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerDied","Data":"b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2"} Dec 01 11:42:36 crc kubenswrapper[4909]: I1201 11:42:36.698943 4909 scope.go:117] "RemoveContainer" containerID="74b7b026b70e8ab5e9f6682b27a220aa61db7324065195ffb38680f8406276f8" Dec 01 11:42:36 crc kubenswrapper[4909]: I1201 11:42:36.699511 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:42:36 crc kubenswrapper[4909]: E1201 11:42:36.699726 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:42:45 crc kubenswrapper[4909]: I1201 11:42:45.776988 4909 generic.go:334] "Generic (PLEG): container finished" podID="fb02b769-e281-4b1d-8bdc-b414fa58587f" containerID="ad1e1b8456d775b4430e06318de6f6047da5949d8b08372974bc9b3607ac4945" exitCode=2 Dec 01 11:42:45 crc kubenswrapper[4909]: I1201 11:42:45.777064 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" event={"ID":"fb02b769-e281-4b1d-8bdc-b414fa58587f","Type":"ContainerDied","Data":"ad1e1b8456d775b4430e06318de6f6047da5949d8b08372974bc9b3607ac4945"} Dec 01 11:42:47 crc kubenswrapper[4909]: I1201 11:42:47.160436 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:42:47 crc kubenswrapper[4909]: I1201 11:42:47.222197 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-ceph\") pod \"fb02b769-e281-4b1d-8bdc-b414fa58587f\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " Dec 01 11:42:47 crc kubenswrapper[4909]: I1201 11:42:47.222320 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-libvirt-secret-0\") pod \"fb02b769-e281-4b1d-8bdc-b414fa58587f\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " Dec 01 11:42:47 crc kubenswrapper[4909]: I1201 11:42:47.222736 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-ssh-key\") pod \"fb02b769-e281-4b1d-8bdc-b414fa58587f\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " Dec 01 11:42:47 crc kubenswrapper[4909]: I1201 11:42:47.223020 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-libvirt-combined-ca-bundle\") pod \"fb02b769-e281-4b1d-8bdc-b414fa58587f\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " Dec 01 11:42:47 crc kubenswrapper[4909]: I1201 11:42:47.223271 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn2jr\" (UniqueName: \"kubernetes.io/projected/fb02b769-e281-4b1d-8bdc-b414fa58587f-kube-api-access-pn2jr\") pod \"fb02b769-e281-4b1d-8bdc-b414fa58587f\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " Dec 01 11:42:47 crc kubenswrapper[4909]: I1201 11:42:47.223310 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-inventory\") pod \"fb02b769-e281-4b1d-8bdc-b414fa58587f\" (UID: \"fb02b769-e281-4b1d-8bdc-b414fa58587f\") " Dec 01 11:42:47 crc kubenswrapper[4909]: I1201 11:42:47.228591 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "fb02b769-e281-4b1d-8bdc-b414fa58587f" (UID: "fb02b769-e281-4b1d-8bdc-b414fa58587f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:47 crc kubenswrapper[4909]: I1201 11:42:47.236116 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb02b769-e281-4b1d-8bdc-b414fa58587f-kube-api-access-pn2jr" (OuterVolumeSpecName: "kube-api-access-pn2jr") pod "fb02b769-e281-4b1d-8bdc-b414fa58587f" (UID: "fb02b769-e281-4b1d-8bdc-b414fa58587f"). InnerVolumeSpecName "kube-api-access-pn2jr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:42:47 crc kubenswrapper[4909]: I1201 11:42:47.237358 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-ceph" (OuterVolumeSpecName: "ceph") pod "fb02b769-e281-4b1d-8bdc-b414fa58587f" (UID: "fb02b769-e281-4b1d-8bdc-b414fa58587f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:47 crc kubenswrapper[4909]: I1201 11:42:47.250419 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fb02b769-e281-4b1d-8bdc-b414fa58587f" (UID: "fb02b769-e281-4b1d-8bdc-b414fa58587f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:47 crc kubenswrapper[4909]: I1201 11:42:47.254004 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "fb02b769-e281-4b1d-8bdc-b414fa58587f" (UID: "fb02b769-e281-4b1d-8bdc-b414fa58587f"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:47 crc kubenswrapper[4909]: I1201 11:42:47.254834 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-inventory" (OuterVolumeSpecName: "inventory") pod "fb02b769-e281-4b1d-8bdc-b414fa58587f" (UID: "fb02b769-e281-4b1d-8bdc-b414fa58587f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:47 crc kubenswrapper[4909]: I1201 11:42:47.325668 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn2jr\" (UniqueName: \"kubernetes.io/projected/fb02b769-e281-4b1d-8bdc-b414fa58587f-kube-api-access-pn2jr\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:47 crc kubenswrapper[4909]: I1201 11:42:47.325707 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:47 crc kubenswrapper[4909]: I1201 11:42:47.325720 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:47 crc kubenswrapper[4909]: I1201 11:42:47.325734 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:47 crc 
kubenswrapper[4909]: I1201 11:42:47.325745 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:47 crc kubenswrapper[4909]: I1201 11:42:47.325759 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb02b769-e281-4b1d-8bdc-b414fa58587f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:47 crc kubenswrapper[4909]: I1201 11:42:47.796159 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" event={"ID":"fb02b769-e281-4b1d-8bdc-b414fa58587f","Type":"ContainerDied","Data":"bbff175c1806f268817c07dda3da58b15b401d569296c70d804b83a37636cef4"} Dec 01 11:42:47 crc kubenswrapper[4909]: I1201 11:42:47.796522 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbff175c1806f268817c07dda3da58b15b401d569296c70d804b83a37636cef4" Dec 01 11:42:47 crc kubenswrapper[4909]: I1201 11:42:47.796216 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr" Dec 01 11:42:50 crc kubenswrapper[4909]: I1201 11:42:50.257163 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:42:50 crc kubenswrapper[4909]: E1201 11:42:50.257700 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:43:03 crc kubenswrapper[4909]: I1201 11:43:03.267193 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:43:03 crc kubenswrapper[4909]: E1201 11:43:03.269237 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:43:17 crc kubenswrapper[4909]: I1201 11:43:17.258343 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:43:17 crc kubenswrapper[4909]: E1201 11:43:17.259175 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:43:29 crc kubenswrapper[4909]: I1201 11:43:29.256833 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:43:29 crc kubenswrapper[4909]: E1201 11:43:29.257576 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:43:41 crc kubenswrapper[4909]: I1201 11:43:41.258330 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:43:41 crc kubenswrapper[4909]: E1201 11:43:41.259162 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:43:54 crc kubenswrapper[4909]: I1201 11:43:54.256673 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:43:54 crc kubenswrapper[4909]: E1201 11:43:54.257388 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:44:06 crc kubenswrapper[4909]: I1201 11:44:06.257823 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:44:06 crc kubenswrapper[4909]: E1201 11:44:06.258660 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:44:17 crc kubenswrapper[4909]: I1201 11:44:17.256966 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:44:17 crc kubenswrapper[4909]: E1201 11:44:17.257700 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:44:32 crc kubenswrapper[4909]: I1201 11:44:32.258219 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:44:32 crc kubenswrapper[4909]: E1201 11:44:32.260052 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:44:46 crc kubenswrapper[4909]: I1201 11:44:46.257439 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:44:46 crc kubenswrapper[4909]: E1201 11:44:46.258213 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:44:59 crc kubenswrapper[4909]: I1201 11:44:59.258000 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:44:59 crc kubenswrapper[4909]: E1201 11:44:59.258720 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.152722 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409825-8r25l"] Dec 01 11:45:00 crc kubenswrapper[4909]: E1201 11:45:00.153502 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f05f63ff-8f5d-4502-a78c-9cc6332f6cb7" containerName="extract-utilities" Dec 
01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.153524 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05f63ff-8f5d-4502-a78c-9cc6332f6cb7" containerName="extract-utilities" Dec 01 11:45:00 crc kubenswrapper[4909]: E1201 11:45:00.153536 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f05f63ff-8f5d-4502-a78c-9cc6332f6cb7" containerName="registry-server" Dec 01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.153544 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05f63ff-8f5d-4502-a78c-9cc6332f6cb7" containerName="registry-server" Dec 01 11:45:00 crc kubenswrapper[4909]: E1201 11:45:00.153559 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb02b769-e281-4b1d-8bdc-b414fa58587f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.153568 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb02b769-e281-4b1d-8bdc-b414fa58587f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 11:45:00 crc kubenswrapper[4909]: E1201 11:45:00.153613 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f05f63ff-8f5d-4502-a78c-9cc6332f6cb7" containerName="extract-content" Dec 01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.153632 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05f63ff-8f5d-4502-a78c-9cc6332f6cb7" containerName="extract-content" Dec 01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.153855 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f05f63ff-8f5d-4502-a78c-9cc6332f6cb7" containerName="registry-server" Dec 01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.153893 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb02b769-e281-4b1d-8bdc-b414fa58587f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.154624 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-8r25l" Dec 01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.157104 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.157185 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.172034 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409825-8r25l"] Dec 01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.236975 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx5d7\" (UniqueName: \"kubernetes.io/projected/7b497571-a7ed-4288-a210-0f8742adb9e5-kube-api-access-mx5d7\") pod \"collect-profiles-29409825-8r25l\" (UID: \"7b497571-a7ed-4288-a210-0f8742adb9e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-8r25l" Dec 01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.237045 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b497571-a7ed-4288-a210-0f8742adb9e5-config-volume\") pod \"collect-profiles-29409825-8r25l\" (UID: \"7b497571-a7ed-4288-a210-0f8742adb9e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-8r25l" Dec 01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.237086 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b497571-a7ed-4288-a210-0f8742adb9e5-secret-volume\") pod \"collect-profiles-29409825-8r25l\" (UID: \"7b497571-a7ed-4288-a210-0f8742adb9e5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-8r25l" Dec 01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.338437 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b497571-a7ed-4288-a210-0f8742adb9e5-config-volume\") pod \"collect-profiles-29409825-8r25l\" (UID: \"7b497571-a7ed-4288-a210-0f8742adb9e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-8r25l" Dec 01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.338522 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b497571-a7ed-4288-a210-0f8742adb9e5-secret-volume\") pod \"collect-profiles-29409825-8r25l\" (UID: \"7b497571-a7ed-4288-a210-0f8742adb9e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-8r25l" Dec 01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.338691 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx5d7\" (UniqueName: \"kubernetes.io/projected/7b497571-a7ed-4288-a210-0f8742adb9e5-kube-api-access-mx5d7\") pod \"collect-profiles-29409825-8r25l\" (UID: \"7b497571-a7ed-4288-a210-0f8742adb9e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-8r25l" Dec 01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.340234 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b497571-a7ed-4288-a210-0f8742adb9e5-config-volume\") pod \"collect-profiles-29409825-8r25l\" (UID: \"7b497571-a7ed-4288-a210-0f8742adb9e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-8r25l" Dec 01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.345919 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7b497571-a7ed-4288-a210-0f8742adb9e5-secret-volume\") pod \"collect-profiles-29409825-8r25l\" (UID: \"7b497571-a7ed-4288-a210-0f8742adb9e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-8r25l" Dec 01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.357756 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx5d7\" (UniqueName: \"kubernetes.io/projected/7b497571-a7ed-4288-a210-0f8742adb9e5-kube-api-access-mx5d7\") pod \"collect-profiles-29409825-8r25l\" (UID: \"7b497571-a7ed-4288-a210-0f8742adb9e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-8r25l" Dec 01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.478815 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-8r25l" Dec 01 11:45:00 crc kubenswrapper[4909]: I1201 11:45:00.933006 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409825-8r25l"] Dec 01 11:45:01 crc kubenswrapper[4909]: I1201 11:45:01.916381 4909 generic.go:334] "Generic (PLEG): container finished" podID="7b497571-a7ed-4288-a210-0f8742adb9e5" containerID="5fa680be8f0b7861708f4c6d9869f0707097df2491b3d1a4990a27c26daaafc0" exitCode=0 Dec 01 11:45:01 crc kubenswrapper[4909]: I1201 11:45:01.916464 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-8r25l" event={"ID":"7b497571-a7ed-4288-a210-0f8742adb9e5","Type":"ContainerDied","Data":"5fa680be8f0b7861708f4c6d9869f0707097df2491b3d1a4990a27c26daaafc0"} Dec 01 11:45:01 crc kubenswrapper[4909]: I1201 11:45:01.916670 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-8r25l" 
event={"ID":"7b497571-a7ed-4288-a210-0f8742adb9e5","Type":"ContainerStarted","Data":"f80dd7355dbec989fb160b227ed1b8af59a51c5f635d6b36bcd2d2ab8260d400"} Dec 01 11:45:03 crc kubenswrapper[4909]: I1201 11:45:03.225740 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-8r25l" Dec 01 11:45:03 crc kubenswrapper[4909]: I1201 11:45:03.290760 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b497571-a7ed-4288-a210-0f8742adb9e5-config-volume\") pod \"7b497571-a7ed-4288-a210-0f8742adb9e5\" (UID: \"7b497571-a7ed-4288-a210-0f8742adb9e5\") " Dec 01 11:45:03 crc kubenswrapper[4909]: I1201 11:45:03.290861 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx5d7\" (UniqueName: \"kubernetes.io/projected/7b497571-a7ed-4288-a210-0f8742adb9e5-kube-api-access-mx5d7\") pod \"7b497571-a7ed-4288-a210-0f8742adb9e5\" (UID: \"7b497571-a7ed-4288-a210-0f8742adb9e5\") " Dec 01 11:45:03 crc kubenswrapper[4909]: I1201 11:45:03.291025 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b497571-a7ed-4288-a210-0f8742adb9e5-secret-volume\") pod \"7b497571-a7ed-4288-a210-0f8742adb9e5\" (UID: \"7b497571-a7ed-4288-a210-0f8742adb9e5\") " Dec 01 11:45:03 crc kubenswrapper[4909]: I1201 11:45:03.292737 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b497571-a7ed-4288-a210-0f8742adb9e5-config-volume" (OuterVolumeSpecName: "config-volume") pod "7b497571-a7ed-4288-a210-0f8742adb9e5" (UID: "7b497571-a7ed-4288-a210-0f8742adb9e5"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:45:03 crc kubenswrapper[4909]: I1201 11:45:03.310779 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b497571-a7ed-4288-a210-0f8742adb9e5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7b497571-a7ed-4288-a210-0f8742adb9e5" (UID: "7b497571-a7ed-4288-a210-0f8742adb9e5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:45:03 crc kubenswrapper[4909]: I1201 11:45:03.310954 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b497571-a7ed-4288-a210-0f8742adb9e5-kube-api-access-mx5d7" (OuterVolumeSpecName: "kube-api-access-mx5d7") pod "7b497571-a7ed-4288-a210-0f8742adb9e5" (UID: "7b497571-a7ed-4288-a210-0f8742adb9e5"). InnerVolumeSpecName "kube-api-access-mx5d7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:45:03 crc kubenswrapper[4909]: I1201 11:45:03.392832 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b497571-a7ed-4288-a210-0f8742adb9e5-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 11:45:03 crc kubenswrapper[4909]: I1201 11:45:03.392902 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b497571-a7ed-4288-a210-0f8742adb9e5-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 11:45:03 crc kubenswrapper[4909]: I1201 11:45:03.392914 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx5d7\" (UniqueName: \"kubernetes.io/projected/7b497571-a7ed-4288-a210-0f8742adb9e5-kube-api-access-mx5d7\") on node \"crc\" DevicePath \"\"" Dec 01 11:45:03 crc kubenswrapper[4909]: I1201 11:45:03.933860 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-8r25l" 
event={"ID":"7b497571-a7ed-4288-a210-0f8742adb9e5","Type":"ContainerDied","Data":"f80dd7355dbec989fb160b227ed1b8af59a51c5f635d6b36bcd2d2ab8260d400"} Dec 01 11:45:03 crc kubenswrapper[4909]: I1201 11:45:03.933916 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f80dd7355dbec989fb160b227ed1b8af59a51c5f635d6b36bcd2d2ab8260d400" Dec 01 11:45:03 crc kubenswrapper[4909]: I1201 11:45:03.933974 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-8r25l" Dec 01 11:45:04 crc kubenswrapper[4909]: I1201 11:45:04.296110 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409780-9cj8w"] Dec 01 11:45:04 crc kubenswrapper[4909]: I1201 11:45:04.303544 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409780-9cj8w"] Dec 01 11:45:05 crc kubenswrapper[4909]: I1201 11:45:05.269154 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d3bf408-451e-4396-91f6-6340297bacf9" path="/var/lib/kubelet/pods/5d3bf408-451e-4396-91f6-6340297bacf9/volumes" Dec 01 11:45:13 crc kubenswrapper[4909]: I1201 11:45:13.262727 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:45:13 crc kubenswrapper[4909]: E1201 11:45:13.263603 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:45:24 crc kubenswrapper[4909]: I1201 11:45:24.256814 4909 scope.go:117] "RemoveContainer" 
containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:45:24 crc kubenswrapper[4909]: E1201 11:45:24.257582 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.033331 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt"] Dec 01 11:45:25 crc kubenswrapper[4909]: E1201 11:45:25.034092 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b497571-a7ed-4288-a210-0f8742adb9e5" containerName="collect-profiles" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.034118 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b497571-a7ed-4288-a210-0f8742adb9e5" containerName="collect-profiles" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.034289 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b497571-a7ed-4288-a210-0f8742adb9e5" containerName="collect-profiles" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.035072 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.037380 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.037525 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.037526 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.037605 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.037437 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.043852 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt"] Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.044449 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.108849 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djfbt\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.108901 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djfbt\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.108984 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djfbt\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.109022 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djfbt\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.109064 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djfbt\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.109088 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4zvb\" (UniqueName: \"kubernetes.io/projected/62961846-7ca9-4d96-8f98-84570706b555-kube-api-access-v4zvb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djfbt\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.211199 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djfbt\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.211278 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djfbt\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.211321 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4zvb\" (UniqueName: \"kubernetes.io/projected/62961846-7ca9-4d96-8f98-84570706b555-kube-api-access-v4zvb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djfbt\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.211425 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djfbt\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.211453 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djfbt\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.211510 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djfbt\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.220577 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djfbt\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.220595 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djfbt\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.220650 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djfbt\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.220735 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djfbt\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.225297 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djfbt\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.229526 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4zvb\" (UniqueName: \"kubernetes.io/projected/62961846-7ca9-4d96-8f98-84570706b555-kube-api-access-v4zvb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djfbt\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.354553 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.882928 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt"] Dec 01 11:45:25 crc kubenswrapper[4909]: I1201 11:45:25.888351 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 11:45:26 crc kubenswrapper[4909]: I1201 11:45:26.107448 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" event={"ID":"62961846-7ca9-4d96-8f98-84570706b555","Type":"ContainerStarted","Data":"593ab78ec0ffe7b2b1a05041b0d2349ca555db6b7b35a9fa619b6f95ed794c57"} Dec 01 11:45:27 crc kubenswrapper[4909]: I1201 11:45:27.121194 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" event={"ID":"62961846-7ca9-4d96-8f98-84570706b555","Type":"ContainerStarted","Data":"5f56d9fc7e80b276daa7ad1001ad663a1602b431253bfb37c47c4a7800358636"} Dec 01 11:45:27 crc kubenswrapper[4909]: I1201 11:45:27.145894 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" podStartSLOduration=1.571529097 podStartE2EDuration="2.14585858s" podCreationTimestamp="2025-12-01 11:45:25 +0000 UTC" firstStartedPulling="2025-12-01 11:45:25.888087754 +0000 UTC m=+4443.122558652" lastFinishedPulling="2025-12-01 11:45:26.462417237 +0000 UTC m=+4443.696888135" observedRunningTime="2025-12-01 11:45:27.140066143 +0000 UTC m=+4444.374537041" watchObservedRunningTime="2025-12-01 11:45:27.14585858 +0000 UTC m=+4444.380329478" Dec 01 11:45:38 crc kubenswrapper[4909]: I1201 11:45:38.257998 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:45:38 crc kubenswrapper[4909]: E1201 11:45:38.260139 
4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:45:43 crc kubenswrapper[4909]: I1201 11:45:43.645015 4909 scope.go:117] "RemoveContainer" containerID="d734f22d7d0be1b25d612f481cffce815cae07d59dcb369a78bffd7e78d285fb" Dec 01 11:45:50 crc kubenswrapper[4909]: I1201 11:45:50.257044 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:45:50 crc kubenswrapper[4909]: E1201 11:45:50.257604 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:46:05 crc kubenswrapper[4909]: I1201 11:46:05.257252 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:46:05 crc kubenswrapper[4909]: E1201 11:46:05.258051 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:46:18 crc kubenswrapper[4909]: I1201 
11:46:18.256998 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:46:18 crc kubenswrapper[4909]: E1201 11:46:18.257810 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:46:33 crc kubenswrapper[4909]: I1201 11:46:33.264857 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:46:33 crc kubenswrapper[4909]: E1201 11:46:33.265837 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:46:44 crc kubenswrapper[4909]: I1201 11:46:44.257921 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:46:44 crc kubenswrapper[4909]: E1201 11:46:44.259267 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:46:56 crc 
kubenswrapper[4909]: I1201 11:46:56.257860 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:46:56 crc kubenswrapper[4909]: E1201 11:46:56.258715 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:47:10 crc kubenswrapper[4909]: I1201 11:47:10.257984 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:47:10 crc kubenswrapper[4909]: E1201 11:47:10.259269 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:47:10 crc kubenswrapper[4909]: I1201 11:47:10.751377 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8z7nq"] Dec 01 11:47:10 crc kubenswrapper[4909]: I1201 11:47:10.753277 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8z7nq" Dec 01 11:47:10 crc kubenswrapper[4909]: I1201 11:47:10.765606 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8z7nq"] Dec 01 11:47:10 crc kubenswrapper[4909]: I1201 11:47:10.838224 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clb8q\" (UniqueName: \"kubernetes.io/projected/c18487f9-9d29-4164-9652-052bc763a829-kube-api-access-clb8q\") pod \"community-operators-8z7nq\" (UID: \"c18487f9-9d29-4164-9652-052bc763a829\") " pod="openshift-marketplace/community-operators-8z7nq" Dec 01 11:47:10 crc kubenswrapper[4909]: I1201 11:47:10.838321 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c18487f9-9d29-4164-9652-052bc763a829-catalog-content\") pod \"community-operators-8z7nq\" (UID: \"c18487f9-9d29-4164-9652-052bc763a829\") " pod="openshift-marketplace/community-operators-8z7nq" Dec 01 11:47:10 crc kubenswrapper[4909]: I1201 11:47:10.838424 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c18487f9-9d29-4164-9652-052bc763a829-utilities\") pod \"community-operators-8z7nq\" (UID: \"c18487f9-9d29-4164-9652-052bc763a829\") " pod="openshift-marketplace/community-operators-8z7nq" Dec 01 11:47:10 crc kubenswrapper[4909]: I1201 11:47:10.939773 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clb8q\" (UniqueName: \"kubernetes.io/projected/c18487f9-9d29-4164-9652-052bc763a829-kube-api-access-clb8q\") pod \"community-operators-8z7nq\" (UID: \"c18487f9-9d29-4164-9652-052bc763a829\") " pod="openshift-marketplace/community-operators-8z7nq" Dec 01 11:47:10 crc kubenswrapper[4909]: I1201 11:47:10.939923 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c18487f9-9d29-4164-9652-052bc763a829-catalog-content\") pod \"community-operators-8z7nq\" (UID: \"c18487f9-9d29-4164-9652-052bc763a829\") " pod="openshift-marketplace/community-operators-8z7nq" Dec 01 11:47:10 crc kubenswrapper[4909]: I1201 11:47:10.939962 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c18487f9-9d29-4164-9652-052bc763a829-utilities\") pod \"community-operators-8z7nq\" (UID: \"c18487f9-9d29-4164-9652-052bc763a829\") " pod="openshift-marketplace/community-operators-8z7nq" Dec 01 11:47:10 crc kubenswrapper[4909]: I1201 11:47:10.940418 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c18487f9-9d29-4164-9652-052bc763a829-catalog-content\") pod \"community-operators-8z7nq\" (UID: \"c18487f9-9d29-4164-9652-052bc763a829\") " pod="openshift-marketplace/community-operators-8z7nq" Dec 01 11:47:10 crc kubenswrapper[4909]: I1201 11:47:10.940474 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c18487f9-9d29-4164-9652-052bc763a829-utilities\") pod \"community-operators-8z7nq\" (UID: \"c18487f9-9d29-4164-9652-052bc763a829\") " pod="openshift-marketplace/community-operators-8z7nq" Dec 01 11:47:10 crc kubenswrapper[4909]: I1201 11:47:10.961547 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clb8q\" (UniqueName: \"kubernetes.io/projected/c18487f9-9d29-4164-9652-052bc763a829-kube-api-access-clb8q\") pod \"community-operators-8z7nq\" (UID: \"c18487f9-9d29-4164-9652-052bc763a829\") " pod="openshift-marketplace/community-operators-8z7nq" Dec 01 11:47:11 crc kubenswrapper[4909]: I1201 11:47:11.091106 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8z7nq" Dec 01 11:47:11 crc kubenswrapper[4909]: I1201 11:47:11.592622 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8z7nq"] Dec 01 11:47:12 crc kubenswrapper[4909]: I1201 11:47:12.068804 4909 generic.go:334] "Generic (PLEG): container finished" podID="c18487f9-9d29-4164-9652-052bc763a829" containerID="7894a06de3ad7d11d74d2ed0c2541b87a75e831e4c46792f6c17e9293bfc140a" exitCode=0 Dec 01 11:47:12 crc kubenswrapper[4909]: I1201 11:47:12.068933 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8z7nq" event={"ID":"c18487f9-9d29-4164-9652-052bc763a829","Type":"ContainerDied","Data":"7894a06de3ad7d11d74d2ed0c2541b87a75e831e4c46792f6c17e9293bfc140a"} Dec 01 11:47:12 crc kubenswrapper[4909]: I1201 11:47:12.069194 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8z7nq" event={"ID":"c18487f9-9d29-4164-9652-052bc763a829","Type":"ContainerStarted","Data":"bc2df17cbab8ba8ff25daaba8d282a9a86f541a7dde652759a2b76366ba79e0a"} Dec 01 11:47:14 crc kubenswrapper[4909]: I1201 11:47:14.333295 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5nrqk"] Dec 01 11:47:14 crc kubenswrapper[4909]: I1201 11:47:14.335590 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5nrqk" Dec 01 11:47:14 crc kubenswrapper[4909]: I1201 11:47:14.347233 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5nrqk"] Dec 01 11:47:14 crc kubenswrapper[4909]: I1201 11:47:14.411801 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68b5x\" (UniqueName: \"kubernetes.io/projected/ac565db6-8900-4bab-ac1f-98de9a0d52d1-kube-api-access-68b5x\") pod \"redhat-operators-5nrqk\" (UID: \"ac565db6-8900-4bab-ac1f-98de9a0d52d1\") " pod="openshift-marketplace/redhat-operators-5nrqk" Dec 01 11:47:14 crc kubenswrapper[4909]: I1201 11:47:14.412520 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac565db6-8900-4bab-ac1f-98de9a0d52d1-catalog-content\") pod \"redhat-operators-5nrqk\" (UID: \"ac565db6-8900-4bab-ac1f-98de9a0d52d1\") " pod="openshift-marketplace/redhat-operators-5nrqk" Dec 01 11:47:14 crc kubenswrapper[4909]: I1201 11:47:14.412581 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac565db6-8900-4bab-ac1f-98de9a0d52d1-utilities\") pod \"redhat-operators-5nrqk\" (UID: \"ac565db6-8900-4bab-ac1f-98de9a0d52d1\") " pod="openshift-marketplace/redhat-operators-5nrqk" Dec 01 11:47:14 crc kubenswrapper[4909]: I1201 11:47:14.514592 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac565db6-8900-4bab-ac1f-98de9a0d52d1-catalog-content\") pod \"redhat-operators-5nrqk\" (UID: \"ac565db6-8900-4bab-ac1f-98de9a0d52d1\") " pod="openshift-marketplace/redhat-operators-5nrqk" Dec 01 11:47:14 crc kubenswrapper[4909]: I1201 11:47:14.514988 4909 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac565db6-8900-4bab-ac1f-98de9a0d52d1-utilities\") pod \"redhat-operators-5nrqk\" (UID: \"ac565db6-8900-4bab-ac1f-98de9a0d52d1\") " pod="openshift-marketplace/redhat-operators-5nrqk" Dec 01 11:47:14 crc kubenswrapper[4909]: I1201 11:47:14.515170 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68b5x\" (UniqueName: \"kubernetes.io/projected/ac565db6-8900-4bab-ac1f-98de9a0d52d1-kube-api-access-68b5x\") pod \"redhat-operators-5nrqk\" (UID: \"ac565db6-8900-4bab-ac1f-98de9a0d52d1\") " pod="openshift-marketplace/redhat-operators-5nrqk" Dec 01 11:47:14 crc kubenswrapper[4909]: I1201 11:47:14.515288 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac565db6-8900-4bab-ac1f-98de9a0d52d1-catalog-content\") pod \"redhat-operators-5nrqk\" (UID: \"ac565db6-8900-4bab-ac1f-98de9a0d52d1\") " pod="openshift-marketplace/redhat-operators-5nrqk" Dec 01 11:47:14 crc kubenswrapper[4909]: I1201 11:47:14.515563 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac565db6-8900-4bab-ac1f-98de9a0d52d1-utilities\") pod \"redhat-operators-5nrqk\" (UID: \"ac565db6-8900-4bab-ac1f-98de9a0d52d1\") " pod="openshift-marketplace/redhat-operators-5nrqk" Dec 01 11:47:14 crc kubenswrapper[4909]: I1201 11:47:14.537489 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68b5x\" (UniqueName: \"kubernetes.io/projected/ac565db6-8900-4bab-ac1f-98de9a0d52d1-kube-api-access-68b5x\") pod \"redhat-operators-5nrqk\" (UID: \"ac565db6-8900-4bab-ac1f-98de9a0d52d1\") " pod="openshift-marketplace/redhat-operators-5nrqk" Dec 01 11:47:14 crc kubenswrapper[4909]: I1201 11:47:14.661763 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5nrqk" Dec 01 11:47:16 crc kubenswrapper[4909]: I1201 11:47:16.111162 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8z7nq" event={"ID":"c18487f9-9d29-4164-9652-052bc763a829","Type":"ContainerStarted","Data":"9935006512e2023fb4d088b8b0caf266f580dd78f1b98629e74937c4bdc23765"} Dec 01 11:47:16 crc kubenswrapper[4909]: W1201 11:47:16.180701 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac565db6_8900_4bab_ac1f_98de9a0d52d1.slice/crio-11867267f8d1dac955fddeb15d14529a0c50b2c677994ce31ff60cf6406bbf01 WatchSource:0}: Error finding container 11867267f8d1dac955fddeb15d14529a0c50b2c677994ce31ff60cf6406bbf01: Status 404 returned error can't find the container with id 11867267f8d1dac955fddeb15d14529a0c50b2c677994ce31ff60cf6406bbf01 Dec 01 11:47:16 crc kubenswrapper[4909]: I1201 11:47:16.186064 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5nrqk"] Dec 01 11:47:17 crc kubenswrapper[4909]: I1201 11:47:17.121510 4909 generic.go:334] "Generic (PLEG): container finished" podID="ac565db6-8900-4bab-ac1f-98de9a0d52d1" containerID="132858d830d04ffa78eea88bf63d4eb2ef5bedc846032f76bfe049042d0fcf60" exitCode=0 Dec 01 11:47:17 crc kubenswrapper[4909]: I1201 11:47:17.121599 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5nrqk" event={"ID":"ac565db6-8900-4bab-ac1f-98de9a0d52d1","Type":"ContainerDied","Data":"132858d830d04ffa78eea88bf63d4eb2ef5bedc846032f76bfe049042d0fcf60"} Dec 01 11:47:17 crc kubenswrapper[4909]: I1201 11:47:17.122735 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5nrqk" 
event={"ID":"ac565db6-8900-4bab-ac1f-98de9a0d52d1","Type":"ContainerStarted","Data":"11867267f8d1dac955fddeb15d14529a0c50b2c677994ce31ff60cf6406bbf01"} Dec 01 11:47:17 crc kubenswrapper[4909]: I1201 11:47:17.125944 4909 generic.go:334] "Generic (PLEG): container finished" podID="c18487f9-9d29-4164-9652-052bc763a829" containerID="9935006512e2023fb4d088b8b0caf266f580dd78f1b98629e74937c4bdc23765" exitCode=0 Dec 01 11:47:17 crc kubenswrapper[4909]: I1201 11:47:17.126014 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8z7nq" event={"ID":"c18487f9-9d29-4164-9652-052bc763a829","Type":"ContainerDied","Data":"9935006512e2023fb4d088b8b0caf266f580dd78f1b98629e74937c4bdc23765"} Dec 01 11:47:18 crc kubenswrapper[4909]: I1201 11:47:18.138335 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5nrqk" event={"ID":"ac565db6-8900-4bab-ac1f-98de9a0d52d1","Type":"ContainerStarted","Data":"4bcafc2939e0ccdf1bf77c65fc3eeb8ce009d71632d9e71fc60e9bab8933b27c"} Dec 01 11:47:19 crc kubenswrapper[4909]: I1201 11:47:19.167371 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8z7nq" event={"ID":"c18487f9-9d29-4164-9652-052bc763a829","Type":"ContainerStarted","Data":"6aed7355f694f812a7ab1695a5cf2ab19794166a8f0dde545e24925ef46d6375"} Dec 01 11:47:19 crc kubenswrapper[4909]: I1201 11:47:19.192335 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8z7nq" podStartSLOduration=3.028391748 podStartE2EDuration="9.192313976s" podCreationTimestamp="2025-12-01 11:47:10 +0000 UTC" firstStartedPulling="2025-12-01 11:47:12.070649537 +0000 UTC m=+4549.305120435" lastFinishedPulling="2025-12-01 11:47:18.234571755 +0000 UTC m=+4555.469042663" observedRunningTime="2025-12-01 11:47:19.184359695 +0000 UTC m=+4556.418830593" watchObservedRunningTime="2025-12-01 11:47:19.192313976 +0000 UTC 
m=+4556.426784874" Dec 01 11:47:21 crc kubenswrapper[4909]: I1201 11:47:21.091626 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8z7nq" Dec 01 11:47:21 crc kubenswrapper[4909]: I1201 11:47:21.092004 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8z7nq" Dec 01 11:47:21 crc kubenswrapper[4909]: I1201 11:47:21.143120 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8z7nq" Dec 01 11:47:21 crc kubenswrapper[4909]: I1201 11:47:21.183526 4909 generic.go:334] "Generic (PLEG): container finished" podID="ac565db6-8900-4bab-ac1f-98de9a0d52d1" containerID="4bcafc2939e0ccdf1bf77c65fc3eeb8ce009d71632d9e71fc60e9bab8933b27c" exitCode=0 Dec 01 11:47:21 crc kubenswrapper[4909]: I1201 11:47:21.183626 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5nrqk" event={"ID":"ac565db6-8900-4bab-ac1f-98de9a0d52d1","Type":"ContainerDied","Data":"4bcafc2939e0ccdf1bf77c65fc3eeb8ce009d71632d9e71fc60e9bab8933b27c"} Dec 01 11:47:21 crc kubenswrapper[4909]: I1201 11:47:21.259701 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:47:21 crc kubenswrapper[4909]: E1201 11:47:21.260038 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:47:22 crc kubenswrapper[4909]: I1201 11:47:22.194549 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-5nrqk" event={"ID":"ac565db6-8900-4bab-ac1f-98de9a0d52d1","Type":"ContainerStarted","Data":"2710dc3a5c211187681ba12c91841d56bc297fd5f599ba338b12d559ad5fc31d"} Dec 01 11:47:22 crc kubenswrapper[4909]: I1201 11:47:22.223478 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5nrqk" podStartSLOduration=3.4091763139999998 podStartE2EDuration="8.223439972s" podCreationTimestamp="2025-12-01 11:47:14 +0000 UTC" firstStartedPulling="2025-12-01 11:47:17.124148261 +0000 UTC m=+4554.358619179" lastFinishedPulling="2025-12-01 11:47:21.938411939 +0000 UTC m=+4559.172882837" observedRunningTime="2025-12-01 11:47:22.211124976 +0000 UTC m=+4559.445595904" watchObservedRunningTime="2025-12-01 11:47:22.223439972 +0000 UTC m=+4559.457910890" Dec 01 11:47:24 crc kubenswrapper[4909]: I1201 11:47:24.662965 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5nrqk" Dec 01 11:47:24 crc kubenswrapper[4909]: I1201 11:47:24.664524 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5nrqk" Dec 01 11:47:25 crc kubenswrapper[4909]: I1201 11:47:25.720554 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5nrqk" podUID="ac565db6-8900-4bab-ac1f-98de9a0d52d1" containerName="registry-server" probeResult="failure" output=< Dec 01 11:47:25 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Dec 01 11:47:25 crc kubenswrapper[4909]: > Dec 01 11:47:31 crc kubenswrapper[4909]: I1201 11:47:31.135560 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8z7nq" Dec 01 11:47:31 crc kubenswrapper[4909]: I1201 11:47:31.220302 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8z7nq"] Dec 
01 11:47:31 crc kubenswrapper[4909]: I1201 11:47:31.269082 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6mmvw"] Dec 01 11:47:31 crc kubenswrapper[4909]: I1201 11:47:31.269499 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6mmvw" podUID="c359361c-f3c8-4383-9738-f3858ef23e33" containerName="registry-server" containerID="cri-o://b66bf8f43fa2df4bff5827a40c79e8a7c85234d499e4482e2155800079125184" gracePeriod=2 Dec 01 11:47:31 crc kubenswrapper[4909]: I1201 11:47:31.807008 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6mmvw" Dec 01 11:47:31 crc kubenswrapper[4909]: I1201 11:47:31.961888 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c359361c-f3c8-4383-9738-f3858ef23e33-utilities\") pod \"c359361c-f3c8-4383-9738-f3858ef23e33\" (UID: \"c359361c-f3c8-4383-9738-f3858ef23e33\") " Dec 01 11:47:31 crc kubenswrapper[4909]: I1201 11:47:31.961967 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c359361c-f3c8-4383-9738-f3858ef23e33-catalog-content\") pod \"c359361c-f3c8-4383-9738-f3858ef23e33\" (UID: \"c359361c-f3c8-4383-9738-f3858ef23e33\") " Dec 01 11:47:31 crc kubenswrapper[4909]: I1201 11:47:31.962039 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgq4n\" (UniqueName: \"kubernetes.io/projected/c359361c-f3c8-4383-9738-f3858ef23e33-kube-api-access-jgq4n\") pod \"c359361c-f3c8-4383-9738-f3858ef23e33\" (UID: \"c359361c-f3c8-4383-9738-f3858ef23e33\") " Dec 01 11:47:31 crc kubenswrapper[4909]: I1201 11:47:31.964384 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c359361c-f3c8-4383-9738-f3858ef23e33-utilities" (OuterVolumeSpecName: "utilities") pod "c359361c-f3c8-4383-9738-f3858ef23e33" (UID: "c359361c-f3c8-4383-9738-f3858ef23e33"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:47:31 crc kubenswrapper[4909]: I1201 11:47:31.974281 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c359361c-f3c8-4383-9738-f3858ef23e33-kube-api-access-jgq4n" (OuterVolumeSpecName: "kube-api-access-jgq4n") pod "c359361c-f3c8-4383-9738-f3858ef23e33" (UID: "c359361c-f3c8-4383-9738-f3858ef23e33"). InnerVolumeSpecName "kube-api-access-jgq4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:47:32 crc kubenswrapper[4909]: I1201 11:47:32.033169 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c359361c-f3c8-4383-9738-f3858ef23e33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c359361c-f3c8-4383-9738-f3858ef23e33" (UID: "c359361c-f3c8-4383-9738-f3858ef23e33"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:47:32 crc kubenswrapper[4909]: I1201 11:47:32.064687 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c359361c-f3c8-4383-9738-f3858ef23e33-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:47:32 crc kubenswrapper[4909]: I1201 11:47:32.064759 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c359361c-f3c8-4383-9738-f3858ef23e33-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:47:32 crc kubenswrapper[4909]: I1201 11:47:32.064773 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgq4n\" (UniqueName: \"kubernetes.io/projected/c359361c-f3c8-4383-9738-f3858ef23e33-kube-api-access-jgq4n\") on node \"crc\" DevicePath \"\"" Dec 01 11:47:32 crc kubenswrapper[4909]: I1201 11:47:32.288626 4909 generic.go:334] "Generic (PLEG): container finished" podID="c359361c-f3c8-4383-9738-f3858ef23e33" containerID="b66bf8f43fa2df4bff5827a40c79e8a7c85234d499e4482e2155800079125184" exitCode=0 Dec 01 11:47:32 crc kubenswrapper[4909]: I1201 11:47:32.288698 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mmvw" event={"ID":"c359361c-f3c8-4383-9738-f3858ef23e33","Type":"ContainerDied","Data":"b66bf8f43fa2df4bff5827a40c79e8a7c85234d499e4482e2155800079125184"} Dec 01 11:47:32 crc kubenswrapper[4909]: I1201 11:47:32.288737 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mmvw" event={"ID":"c359361c-f3c8-4383-9738-f3858ef23e33","Type":"ContainerDied","Data":"67900faa1ef0559978584007756ddcb44ee5794e574d1cfc06b996dcbaf23365"} Dec 01 11:47:32 crc kubenswrapper[4909]: I1201 11:47:32.288762 4909 scope.go:117] "RemoveContainer" containerID="b66bf8f43fa2df4bff5827a40c79e8a7c85234d499e4482e2155800079125184" Dec 01 11:47:32 crc kubenswrapper[4909]: I1201 
11:47:32.288798 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6mmvw" Dec 01 11:47:32 crc kubenswrapper[4909]: I1201 11:47:32.312656 4909 scope.go:117] "RemoveContainer" containerID="3bdc13056dbb1250867899e7b571b49523c59ac41a1088d11de436f7191fd5d7" Dec 01 11:47:32 crc kubenswrapper[4909]: I1201 11:47:32.328016 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6mmvw"] Dec 01 11:47:32 crc kubenswrapper[4909]: I1201 11:47:32.334523 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6mmvw"] Dec 01 11:47:32 crc kubenswrapper[4909]: I1201 11:47:32.345072 4909 scope.go:117] "RemoveContainer" containerID="c4362b7ba81dcc46076ac621841723d8b8506a1e64b16527c169d98e3941c9e3" Dec 01 11:47:32 crc kubenswrapper[4909]: I1201 11:47:32.373089 4909 scope.go:117] "RemoveContainer" containerID="b66bf8f43fa2df4bff5827a40c79e8a7c85234d499e4482e2155800079125184" Dec 01 11:47:32 crc kubenswrapper[4909]: E1201 11:47:32.373459 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b66bf8f43fa2df4bff5827a40c79e8a7c85234d499e4482e2155800079125184\": container with ID starting with b66bf8f43fa2df4bff5827a40c79e8a7c85234d499e4482e2155800079125184 not found: ID does not exist" containerID="b66bf8f43fa2df4bff5827a40c79e8a7c85234d499e4482e2155800079125184" Dec 01 11:47:32 crc kubenswrapper[4909]: I1201 11:47:32.373499 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b66bf8f43fa2df4bff5827a40c79e8a7c85234d499e4482e2155800079125184"} err="failed to get container status \"b66bf8f43fa2df4bff5827a40c79e8a7c85234d499e4482e2155800079125184\": rpc error: code = NotFound desc = could not find container \"b66bf8f43fa2df4bff5827a40c79e8a7c85234d499e4482e2155800079125184\": container with ID starting with 
b66bf8f43fa2df4bff5827a40c79e8a7c85234d499e4482e2155800079125184 not found: ID does not exist" Dec 01 11:47:32 crc kubenswrapper[4909]: I1201 11:47:32.373524 4909 scope.go:117] "RemoveContainer" containerID="3bdc13056dbb1250867899e7b571b49523c59ac41a1088d11de436f7191fd5d7" Dec 01 11:47:32 crc kubenswrapper[4909]: E1201 11:47:32.373753 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bdc13056dbb1250867899e7b571b49523c59ac41a1088d11de436f7191fd5d7\": container with ID starting with 3bdc13056dbb1250867899e7b571b49523c59ac41a1088d11de436f7191fd5d7 not found: ID does not exist" containerID="3bdc13056dbb1250867899e7b571b49523c59ac41a1088d11de436f7191fd5d7" Dec 01 11:47:32 crc kubenswrapper[4909]: I1201 11:47:32.373817 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bdc13056dbb1250867899e7b571b49523c59ac41a1088d11de436f7191fd5d7"} err="failed to get container status \"3bdc13056dbb1250867899e7b571b49523c59ac41a1088d11de436f7191fd5d7\": rpc error: code = NotFound desc = could not find container \"3bdc13056dbb1250867899e7b571b49523c59ac41a1088d11de436f7191fd5d7\": container with ID starting with 3bdc13056dbb1250867899e7b571b49523c59ac41a1088d11de436f7191fd5d7 not found: ID does not exist" Dec 01 11:47:32 crc kubenswrapper[4909]: I1201 11:47:32.373839 4909 scope.go:117] "RemoveContainer" containerID="c4362b7ba81dcc46076ac621841723d8b8506a1e64b16527c169d98e3941c9e3" Dec 01 11:47:32 crc kubenswrapper[4909]: E1201 11:47:32.374158 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4362b7ba81dcc46076ac621841723d8b8506a1e64b16527c169d98e3941c9e3\": container with ID starting with c4362b7ba81dcc46076ac621841723d8b8506a1e64b16527c169d98e3941c9e3 not found: ID does not exist" containerID="c4362b7ba81dcc46076ac621841723d8b8506a1e64b16527c169d98e3941c9e3" Dec 01 11:47:32 crc 
kubenswrapper[4909]: I1201 11:47:32.374181 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4362b7ba81dcc46076ac621841723d8b8506a1e64b16527c169d98e3941c9e3"} err="failed to get container status \"c4362b7ba81dcc46076ac621841723d8b8506a1e64b16527c169d98e3941c9e3\": rpc error: code = NotFound desc = could not find container \"c4362b7ba81dcc46076ac621841723d8b8506a1e64b16527c169d98e3941c9e3\": container with ID starting with c4362b7ba81dcc46076ac621841723d8b8506a1e64b16527c169d98e3941c9e3 not found: ID does not exist" Dec 01 11:47:33 crc kubenswrapper[4909]: I1201 11:47:33.268586 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:47:33 crc kubenswrapper[4909]: E1201 11:47:33.268862 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:47:33 crc kubenswrapper[4909]: I1201 11:47:33.275210 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c359361c-f3c8-4383-9738-f3858ef23e33" path="/var/lib/kubelet/pods/c359361c-f3c8-4383-9738-f3858ef23e33/volumes" Dec 01 11:47:34 crc kubenswrapper[4909]: I1201 11:47:34.709750 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5nrqk" Dec 01 11:47:34 crc kubenswrapper[4909]: I1201 11:47:34.762685 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5nrqk" Dec 01 11:47:36 crc kubenswrapper[4909]: I1201 11:47:36.971705 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-5nrqk"] Dec 01 11:47:36 crc kubenswrapper[4909]: I1201 11:47:36.972305 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5nrqk" podUID="ac565db6-8900-4bab-ac1f-98de9a0d52d1" containerName="registry-server" containerID="cri-o://2710dc3a5c211187681ba12c91841d56bc297fd5f599ba338b12d559ad5fc31d" gracePeriod=2 Dec 01 11:47:37 crc kubenswrapper[4909]: I1201 11:47:37.346202 4909 generic.go:334] "Generic (PLEG): container finished" podID="ac565db6-8900-4bab-ac1f-98de9a0d52d1" containerID="2710dc3a5c211187681ba12c91841d56bc297fd5f599ba338b12d559ad5fc31d" exitCode=0 Dec 01 11:47:37 crc kubenswrapper[4909]: I1201 11:47:37.346252 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5nrqk" event={"ID":"ac565db6-8900-4bab-ac1f-98de9a0d52d1","Type":"ContainerDied","Data":"2710dc3a5c211187681ba12c91841d56bc297fd5f599ba338b12d559ad5fc31d"} Dec 01 11:47:37 crc kubenswrapper[4909]: I1201 11:47:37.810555 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5nrqk" Dec 01 11:47:37 crc kubenswrapper[4909]: I1201 11:47:37.877389 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68b5x\" (UniqueName: \"kubernetes.io/projected/ac565db6-8900-4bab-ac1f-98de9a0d52d1-kube-api-access-68b5x\") pod \"ac565db6-8900-4bab-ac1f-98de9a0d52d1\" (UID: \"ac565db6-8900-4bab-ac1f-98de9a0d52d1\") " Dec 01 11:47:37 crc kubenswrapper[4909]: I1201 11:47:37.877439 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac565db6-8900-4bab-ac1f-98de9a0d52d1-catalog-content\") pod \"ac565db6-8900-4bab-ac1f-98de9a0d52d1\" (UID: \"ac565db6-8900-4bab-ac1f-98de9a0d52d1\") " Dec 01 11:47:37 crc kubenswrapper[4909]: I1201 11:47:37.877460 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac565db6-8900-4bab-ac1f-98de9a0d52d1-utilities\") pod \"ac565db6-8900-4bab-ac1f-98de9a0d52d1\" (UID: \"ac565db6-8900-4bab-ac1f-98de9a0d52d1\") " Dec 01 11:47:37 crc kubenswrapper[4909]: I1201 11:47:37.878394 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac565db6-8900-4bab-ac1f-98de9a0d52d1-utilities" (OuterVolumeSpecName: "utilities") pod "ac565db6-8900-4bab-ac1f-98de9a0d52d1" (UID: "ac565db6-8900-4bab-ac1f-98de9a0d52d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:47:37 crc kubenswrapper[4909]: I1201 11:47:37.889142 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac565db6-8900-4bab-ac1f-98de9a0d52d1-kube-api-access-68b5x" (OuterVolumeSpecName: "kube-api-access-68b5x") pod "ac565db6-8900-4bab-ac1f-98de9a0d52d1" (UID: "ac565db6-8900-4bab-ac1f-98de9a0d52d1"). InnerVolumeSpecName "kube-api-access-68b5x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:47:37 crc kubenswrapper[4909]: I1201 11:47:37.979672 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac565db6-8900-4bab-ac1f-98de9a0d52d1-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:47:37 crc kubenswrapper[4909]: I1201 11:47:37.979710 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68b5x\" (UniqueName: \"kubernetes.io/projected/ac565db6-8900-4bab-ac1f-98de9a0d52d1-kube-api-access-68b5x\") on node \"crc\" DevicePath \"\"" Dec 01 11:47:37 crc kubenswrapper[4909]: I1201 11:47:37.996089 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac565db6-8900-4bab-ac1f-98de9a0d52d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac565db6-8900-4bab-ac1f-98de9a0d52d1" (UID: "ac565db6-8900-4bab-ac1f-98de9a0d52d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:47:38 crc kubenswrapper[4909]: I1201 11:47:38.081022 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac565db6-8900-4bab-ac1f-98de9a0d52d1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:47:38 crc kubenswrapper[4909]: I1201 11:47:38.355950 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5nrqk" event={"ID":"ac565db6-8900-4bab-ac1f-98de9a0d52d1","Type":"ContainerDied","Data":"11867267f8d1dac955fddeb15d14529a0c50b2c677994ce31ff60cf6406bbf01"} Dec 01 11:47:38 crc kubenswrapper[4909]: I1201 11:47:38.356003 4909 scope.go:117] "RemoveContainer" containerID="2710dc3a5c211187681ba12c91841d56bc297fd5f599ba338b12d559ad5fc31d" Dec 01 11:47:38 crc kubenswrapper[4909]: I1201 11:47:38.356045 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5nrqk" Dec 01 11:47:38 crc kubenswrapper[4909]: I1201 11:47:38.377082 4909 scope.go:117] "RemoveContainer" containerID="4bcafc2939e0ccdf1bf77c65fc3eeb8ce009d71632d9e71fc60e9bab8933b27c" Dec 01 11:47:38 crc kubenswrapper[4909]: I1201 11:47:38.395287 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5nrqk"] Dec 01 11:47:38 crc kubenswrapper[4909]: I1201 11:47:38.405329 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5nrqk"] Dec 01 11:47:38 crc kubenswrapper[4909]: I1201 11:47:38.407111 4909 scope.go:117] "RemoveContainer" containerID="132858d830d04ffa78eea88bf63d4eb2ef5bedc846032f76bfe049042d0fcf60" Dec 01 11:47:39 crc kubenswrapper[4909]: I1201 11:47:39.268158 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac565db6-8900-4bab-ac1f-98de9a0d52d1" path="/var/lib/kubelet/pods/ac565db6-8900-4bab-ac1f-98de9a0d52d1/volumes" Dec 01 11:47:47 crc kubenswrapper[4909]: I1201 11:47:47.257561 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:47:48 crc kubenswrapper[4909]: I1201 11:47:48.449120 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"f95bf45543b4dac8307be2477f4df53e5276c3bbf60164c22c073a1368e361ee"} Dec 01 11:50:06 crc kubenswrapper[4909]: I1201 11:50:06.193896 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:50:06 crc kubenswrapper[4909]: I1201 11:50:06.194431 4909 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:50:36 crc kubenswrapper[4909]: I1201 11:50:36.193410 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:50:36 crc kubenswrapper[4909]: I1201 11:50:36.193806 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:51:06 crc kubenswrapper[4909]: I1201 11:51:06.194029 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:51:06 crc kubenswrapper[4909]: I1201 11:51:06.195769 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:51:06 crc kubenswrapper[4909]: I1201 11:51:06.195921 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" 
Dec 01 11:51:06 crc kubenswrapper[4909]: I1201 11:51:06.196646 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f95bf45543b4dac8307be2477f4df53e5276c3bbf60164c22c073a1368e361ee"} pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 11:51:06 crc kubenswrapper[4909]: I1201 11:51:06.196807 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" containerID="cri-o://f95bf45543b4dac8307be2477f4df53e5276c3bbf60164c22c073a1368e361ee" gracePeriod=600 Dec 01 11:51:07 crc kubenswrapper[4909]: I1201 11:51:07.178686 4909 generic.go:334] "Generic (PLEG): container finished" podID="672850e4-d044-44cc-b8a2-517dc1a285be" containerID="f95bf45543b4dac8307be2477f4df53e5276c3bbf60164c22c073a1368e361ee" exitCode=0 Dec 01 11:51:07 crc kubenswrapper[4909]: I1201 11:51:07.178773 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerDied","Data":"f95bf45543b4dac8307be2477f4df53e5276c3bbf60164c22c073a1368e361ee"} Dec 01 11:51:07 crc kubenswrapper[4909]: I1201 11:51:07.179218 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807"} Dec 01 11:51:07 crc kubenswrapper[4909]: I1201 11:51:07.179239 4909 scope.go:117] "RemoveContainer" containerID="b395d1653647a790bf68e4fc929d1ff940ccc0a1d2fc6aabfa64ae914155f8a2" Dec 01 11:51:15 crc kubenswrapper[4909]: I1201 11:51:15.251174 
4909 generic.go:334] "Generic (PLEG): container finished" podID="62961846-7ca9-4d96-8f98-84570706b555" containerID="5f56d9fc7e80b276daa7ad1001ad663a1602b431253bfb37c47c4a7800358636" exitCode=2 Dec 01 11:51:15 crc kubenswrapper[4909]: I1201 11:51:15.251245 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" event={"ID":"62961846-7ca9-4d96-8f98-84570706b555","Type":"ContainerDied","Data":"5f56d9fc7e80b276daa7ad1001ad663a1602b431253bfb37c47c4a7800358636"} Dec 01 11:51:16 crc kubenswrapper[4909]: I1201 11:51:16.680905 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:51:16 crc kubenswrapper[4909]: I1201 11:51:16.809337 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-ssh-key\") pod \"62961846-7ca9-4d96-8f98-84570706b555\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " Dec 01 11:51:16 crc kubenswrapper[4909]: I1201 11:51:16.809418 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-libvirt-secret-0\") pod \"62961846-7ca9-4d96-8f98-84570706b555\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " Dec 01 11:51:16 crc kubenswrapper[4909]: I1201 11:51:16.809470 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-inventory\") pod \"62961846-7ca9-4d96-8f98-84570706b555\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " Dec 01 11:51:16 crc kubenswrapper[4909]: I1201 11:51:16.809608 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-libvirt-combined-ca-bundle\") pod \"62961846-7ca9-4d96-8f98-84570706b555\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " Dec 01 11:51:16 crc kubenswrapper[4909]: I1201 11:51:16.809646 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4zvb\" (UniqueName: \"kubernetes.io/projected/62961846-7ca9-4d96-8f98-84570706b555-kube-api-access-v4zvb\") pod \"62961846-7ca9-4d96-8f98-84570706b555\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " Dec 01 11:51:16 crc kubenswrapper[4909]: I1201 11:51:16.809693 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-ceph\") pod \"62961846-7ca9-4d96-8f98-84570706b555\" (UID: \"62961846-7ca9-4d96-8f98-84570706b555\") " Dec 01 11:51:16 crc kubenswrapper[4909]: I1201 11:51:16.816741 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-ceph" (OuterVolumeSpecName: "ceph") pod "62961846-7ca9-4d96-8f98-84570706b555" (UID: "62961846-7ca9-4d96-8f98-84570706b555"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:51:16 crc kubenswrapper[4909]: I1201 11:51:16.817610 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62961846-7ca9-4d96-8f98-84570706b555-kube-api-access-v4zvb" (OuterVolumeSpecName: "kube-api-access-v4zvb") pod "62961846-7ca9-4d96-8f98-84570706b555" (UID: "62961846-7ca9-4d96-8f98-84570706b555"). InnerVolumeSpecName "kube-api-access-v4zvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:51:16 crc kubenswrapper[4909]: I1201 11:51:16.820647 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "62961846-7ca9-4d96-8f98-84570706b555" (UID: "62961846-7ca9-4d96-8f98-84570706b555"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:51:16 crc kubenswrapper[4909]: I1201 11:51:16.839485 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "62961846-7ca9-4d96-8f98-84570706b555" (UID: "62961846-7ca9-4d96-8f98-84570706b555"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:51:16 crc kubenswrapper[4909]: I1201 11:51:16.839647 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-inventory" (OuterVolumeSpecName: "inventory") pod "62961846-7ca9-4d96-8f98-84570706b555" (UID: "62961846-7ca9-4d96-8f98-84570706b555"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:51:16 crc kubenswrapper[4909]: I1201 11:51:16.844531 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "62961846-7ca9-4d96-8f98-84570706b555" (UID: "62961846-7ca9-4d96-8f98-84570706b555"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:51:16 crc kubenswrapper[4909]: I1201 11:51:16.912492 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 11:51:16 crc kubenswrapper[4909]: I1201 11:51:16.912559 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 01 11:51:16 crc kubenswrapper[4909]: I1201 11:51:16.912581 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 11:51:16 crc kubenswrapper[4909]: I1201 11:51:16.912598 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:51:16 crc kubenswrapper[4909]: I1201 11:51:16.912621 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4zvb\" (UniqueName: \"kubernetes.io/projected/62961846-7ca9-4d96-8f98-84570706b555-kube-api-access-v4zvb\") on node \"crc\" DevicePath \"\"" Dec 01 11:51:16 crc kubenswrapper[4909]: I1201 11:51:16.912639 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/62961846-7ca9-4d96-8f98-84570706b555-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 11:51:17 crc kubenswrapper[4909]: I1201 11:51:17.270726 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" Dec 01 11:51:17 crc kubenswrapper[4909]: I1201 11:51:17.275317 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djfbt" event={"ID":"62961846-7ca9-4d96-8f98-84570706b555","Type":"ContainerDied","Data":"593ab78ec0ffe7b2b1a05041b0d2349ca555db6b7b35a9fa619b6f95ed794c57"} Dec 01 11:51:17 crc kubenswrapper[4909]: I1201 11:51:17.275360 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="593ab78ec0ffe7b2b1a05041b0d2349ca555db6b7b35a9fa619b6f95ed794c57" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.549435 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-525z5"] Dec 01 11:51:59 crc kubenswrapper[4909]: E1201 11:51:59.550602 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c359361c-f3c8-4383-9738-f3858ef23e33" containerName="extract-content" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.550627 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c359361c-f3c8-4383-9738-f3858ef23e33" containerName="extract-content" Dec 01 11:51:59 crc kubenswrapper[4909]: E1201 11:51:59.550658 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c359361c-f3c8-4383-9738-f3858ef23e33" containerName="extract-utilities" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.550673 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c359361c-f3c8-4383-9738-f3858ef23e33" containerName="extract-utilities" Dec 01 11:51:59 crc kubenswrapper[4909]: E1201 11:51:59.550698 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62961846-7ca9-4d96-8f98-84570706b555" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.550710 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="62961846-7ca9-4d96-8f98-84570706b555" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 11:51:59 crc kubenswrapper[4909]: E1201 11:51:59.550730 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac565db6-8900-4bab-ac1f-98de9a0d52d1" containerName="registry-server" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.550742 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac565db6-8900-4bab-ac1f-98de9a0d52d1" containerName="registry-server" Dec 01 11:51:59 crc kubenswrapper[4909]: E1201 11:51:59.550758 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac565db6-8900-4bab-ac1f-98de9a0d52d1" containerName="extract-utilities" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.550769 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac565db6-8900-4bab-ac1f-98de9a0d52d1" containerName="extract-utilities" Dec 01 11:51:59 crc kubenswrapper[4909]: E1201 11:51:59.550793 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac565db6-8900-4bab-ac1f-98de9a0d52d1" containerName="extract-content" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.550804 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac565db6-8900-4bab-ac1f-98de9a0d52d1" containerName="extract-content" Dec 01 11:51:59 crc kubenswrapper[4909]: E1201 11:51:59.550831 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c359361c-f3c8-4383-9738-f3858ef23e33" containerName="registry-server" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.550842 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c359361c-f3c8-4383-9738-f3858ef23e33" containerName="registry-server" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.553782 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac565db6-8900-4bab-ac1f-98de9a0d52d1" containerName="registry-server" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.553849 4909 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c359361c-f3c8-4383-9738-f3858ef23e33" containerName="registry-server" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.553868 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="62961846-7ca9-4d96-8f98-84570706b555" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.556203 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-525z5" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.566597 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-525z5"] Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.566721 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnb6d\" (UniqueName: \"kubernetes.io/projected/0d0911a9-c294-46e4-9fc4-8f5e44676f94-kube-api-access-dnb6d\") pod \"certified-operators-525z5\" (UID: \"0d0911a9-c294-46e4-9fc4-8f5e44676f94\") " pod="openshift-marketplace/certified-operators-525z5" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.566755 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d0911a9-c294-46e4-9fc4-8f5e44676f94-catalog-content\") pod \"certified-operators-525z5\" (UID: \"0d0911a9-c294-46e4-9fc4-8f5e44676f94\") " pod="openshift-marketplace/certified-operators-525z5" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.566784 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d0911a9-c294-46e4-9fc4-8f5e44676f94-utilities\") pod \"certified-operators-525z5\" (UID: \"0d0911a9-c294-46e4-9fc4-8f5e44676f94\") " pod="openshift-marketplace/certified-operators-525z5" Dec 01 11:51:59 crc kubenswrapper[4909]: 
I1201 11:51:59.668259 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnb6d\" (UniqueName: \"kubernetes.io/projected/0d0911a9-c294-46e4-9fc4-8f5e44676f94-kube-api-access-dnb6d\") pod \"certified-operators-525z5\" (UID: \"0d0911a9-c294-46e4-9fc4-8f5e44676f94\") " pod="openshift-marketplace/certified-operators-525z5" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.668305 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d0911a9-c294-46e4-9fc4-8f5e44676f94-catalog-content\") pod \"certified-operators-525z5\" (UID: \"0d0911a9-c294-46e4-9fc4-8f5e44676f94\") " pod="openshift-marketplace/certified-operators-525z5" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.668332 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d0911a9-c294-46e4-9fc4-8f5e44676f94-utilities\") pod \"certified-operators-525z5\" (UID: \"0d0911a9-c294-46e4-9fc4-8f5e44676f94\") " pod="openshift-marketplace/certified-operators-525z5" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.668921 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d0911a9-c294-46e4-9fc4-8f5e44676f94-utilities\") pod \"certified-operators-525z5\" (UID: \"0d0911a9-c294-46e4-9fc4-8f5e44676f94\") " pod="openshift-marketplace/certified-operators-525z5" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.669156 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d0911a9-c294-46e4-9fc4-8f5e44676f94-catalog-content\") pod \"certified-operators-525z5\" (UID: \"0d0911a9-c294-46e4-9fc4-8f5e44676f94\") " pod="openshift-marketplace/certified-operators-525z5" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.688050 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnb6d\" (UniqueName: \"kubernetes.io/projected/0d0911a9-c294-46e4-9fc4-8f5e44676f94-kube-api-access-dnb6d\") pod \"certified-operators-525z5\" (UID: \"0d0911a9-c294-46e4-9fc4-8f5e44676f94\") " pod="openshift-marketplace/certified-operators-525z5" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.748753 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nqtxp"] Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.750971 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nqtxp" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.760856 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqtxp"] Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.871662 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4a09f1a-fdcc-4419-8758-a59a35eff793-catalog-content\") pod \"redhat-marketplace-nqtxp\" (UID: \"c4a09f1a-fdcc-4419-8758-a59a35eff793\") " pod="openshift-marketplace/redhat-marketplace-nqtxp" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.871785 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf775\" (UniqueName: \"kubernetes.io/projected/c4a09f1a-fdcc-4419-8758-a59a35eff793-kube-api-access-kf775\") pod \"redhat-marketplace-nqtxp\" (UID: \"c4a09f1a-fdcc-4419-8758-a59a35eff793\") " pod="openshift-marketplace/redhat-marketplace-nqtxp" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.871913 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4a09f1a-fdcc-4419-8758-a59a35eff793-utilities\") pod 
\"redhat-marketplace-nqtxp\" (UID: \"c4a09f1a-fdcc-4419-8758-a59a35eff793\") " pod="openshift-marketplace/redhat-marketplace-nqtxp" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.893332 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-525z5" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.973720 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf775\" (UniqueName: \"kubernetes.io/projected/c4a09f1a-fdcc-4419-8758-a59a35eff793-kube-api-access-kf775\") pod \"redhat-marketplace-nqtxp\" (UID: \"c4a09f1a-fdcc-4419-8758-a59a35eff793\") " pod="openshift-marketplace/redhat-marketplace-nqtxp" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.974183 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4a09f1a-fdcc-4419-8758-a59a35eff793-utilities\") pod \"redhat-marketplace-nqtxp\" (UID: \"c4a09f1a-fdcc-4419-8758-a59a35eff793\") " pod="openshift-marketplace/redhat-marketplace-nqtxp" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.974297 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4a09f1a-fdcc-4419-8758-a59a35eff793-catalog-content\") pod \"redhat-marketplace-nqtxp\" (UID: \"c4a09f1a-fdcc-4419-8758-a59a35eff793\") " pod="openshift-marketplace/redhat-marketplace-nqtxp" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.974700 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4a09f1a-fdcc-4419-8758-a59a35eff793-utilities\") pod \"redhat-marketplace-nqtxp\" (UID: \"c4a09f1a-fdcc-4419-8758-a59a35eff793\") " pod="openshift-marketplace/redhat-marketplace-nqtxp" Dec 01 11:51:59 crc kubenswrapper[4909]: I1201 11:51:59.974754 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4a09f1a-fdcc-4419-8758-a59a35eff793-catalog-content\") pod \"redhat-marketplace-nqtxp\" (UID: \"c4a09f1a-fdcc-4419-8758-a59a35eff793\") " pod="openshift-marketplace/redhat-marketplace-nqtxp" Dec 01 11:52:00 crc kubenswrapper[4909]: I1201 11:52:00.005772 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf775\" (UniqueName: \"kubernetes.io/projected/c4a09f1a-fdcc-4419-8758-a59a35eff793-kube-api-access-kf775\") pod \"redhat-marketplace-nqtxp\" (UID: \"c4a09f1a-fdcc-4419-8758-a59a35eff793\") " pod="openshift-marketplace/redhat-marketplace-nqtxp" Dec 01 11:52:00 crc kubenswrapper[4909]: I1201 11:52:00.080073 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nqtxp" Dec 01 11:52:00 crc kubenswrapper[4909]: I1201 11:52:00.444910 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-525z5"] Dec 01 11:52:00 crc kubenswrapper[4909]: I1201 11:52:00.631730 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqtxp"] Dec 01 11:52:00 crc kubenswrapper[4909]: W1201 11:52:00.636649 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4a09f1a_fdcc_4419_8758_a59a35eff793.slice/crio-5b27f1c939ad2613e5d4757639a89cc094a62cf07cd5a67e49f65f35caaf57be WatchSource:0}: Error finding container 5b27f1c939ad2613e5d4757639a89cc094a62cf07cd5a67e49f65f35caaf57be: Status 404 returned error can't find the container with id 5b27f1c939ad2613e5d4757639a89cc094a62cf07cd5a67e49f65f35caaf57be Dec 01 11:52:00 crc kubenswrapper[4909]: I1201 11:52:00.698112 4909 generic.go:334] "Generic (PLEG): container finished" podID="0d0911a9-c294-46e4-9fc4-8f5e44676f94" containerID="344cef198d5041cd37e2e5bb33e8fa27813319aabe9a166f19db5f7464a6f653" 
exitCode=0 Dec 01 11:52:00 crc kubenswrapper[4909]: I1201 11:52:00.698186 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-525z5" event={"ID":"0d0911a9-c294-46e4-9fc4-8f5e44676f94","Type":"ContainerDied","Data":"344cef198d5041cd37e2e5bb33e8fa27813319aabe9a166f19db5f7464a6f653"} Dec 01 11:52:00 crc kubenswrapper[4909]: I1201 11:52:00.698213 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-525z5" event={"ID":"0d0911a9-c294-46e4-9fc4-8f5e44676f94","Type":"ContainerStarted","Data":"d4eaa4e6612ceb9c163da05c32394b45384d8357daf32b661d2028d76ea2c5a2"} Dec 01 11:52:00 crc kubenswrapper[4909]: I1201 11:52:00.699739 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqtxp" event={"ID":"c4a09f1a-fdcc-4419-8758-a59a35eff793","Type":"ContainerStarted","Data":"5b27f1c939ad2613e5d4757639a89cc094a62cf07cd5a67e49f65f35caaf57be"} Dec 01 11:52:00 crc kubenswrapper[4909]: I1201 11:52:00.699997 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 11:52:01 crc kubenswrapper[4909]: I1201 11:52:01.735596 4909 generic.go:334] "Generic (PLEG): container finished" podID="c4a09f1a-fdcc-4419-8758-a59a35eff793" containerID="61b045190851f6cb3e8ded7e0a5a82e1262b6f17d0b722cb198b9d461c89bda2" exitCode=0 Dec 01 11:52:01 crc kubenswrapper[4909]: I1201 11:52:01.735981 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqtxp" event={"ID":"c4a09f1a-fdcc-4419-8758-a59a35eff793","Type":"ContainerDied","Data":"61b045190851f6cb3e8ded7e0a5a82e1262b6f17d0b722cb198b9d461c89bda2"} Dec 01 11:52:02 crc kubenswrapper[4909]: I1201 11:52:02.746032 4909 generic.go:334] "Generic (PLEG): container finished" podID="0d0911a9-c294-46e4-9fc4-8f5e44676f94" containerID="7383c7ff5a5d82074b23aaa5d564f01e2d44e06b2d5e053ad9371f974f403fdf" exitCode=0 Dec 01 11:52:02 crc 
kubenswrapper[4909]: I1201 11:52:02.746156 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-525z5" event={"ID":"0d0911a9-c294-46e4-9fc4-8f5e44676f94","Type":"ContainerDied","Data":"7383c7ff5a5d82074b23aaa5d564f01e2d44e06b2d5e053ad9371f974f403fdf"} Dec 01 11:52:02 crc kubenswrapper[4909]: I1201 11:52:02.751977 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqtxp" event={"ID":"c4a09f1a-fdcc-4419-8758-a59a35eff793","Type":"ContainerStarted","Data":"dab2000233aa7622e1d6889b5872b9f17784eb52417c51ea78c8b22f7d6baad1"} Dec 01 11:52:03 crc kubenswrapper[4909]: I1201 11:52:03.762334 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-525z5" event={"ID":"0d0911a9-c294-46e4-9fc4-8f5e44676f94","Type":"ContainerStarted","Data":"39f8fe62e555ce6888022e762a2b799f63e160d4bfbab6862063ddf59038319a"} Dec 01 11:52:03 crc kubenswrapper[4909]: I1201 11:52:03.764645 4909 generic.go:334] "Generic (PLEG): container finished" podID="c4a09f1a-fdcc-4419-8758-a59a35eff793" containerID="dab2000233aa7622e1d6889b5872b9f17784eb52417c51ea78c8b22f7d6baad1" exitCode=0 Dec 01 11:52:03 crc kubenswrapper[4909]: I1201 11:52:03.764677 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqtxp" event={"ID":"c4a09f1a-fdcc-4419-8758-a59a35eff793","Type":"ContainerDied","Data":"dab2000233aa7622e1d6889b5872b9f17784eb52417c51ea78c8b22f7d6baad1"} Dec 01 11:52:03 crc kubenswrapper[4909]: I1201 11:52:03.791947 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-525z5" podStartSLOduration=2.176619801 podStartE2EDuration="4.791923242s" podCreationTimestamp="2025-12-01 11:51:59 +0000 UTC" firstStartedPulling="2025-12-01 11:52:00.699690527 +0000 UTC m=+4837.934161425" lastFinishedPulling="2025-12-01 11:52:03.314993968 +0000 UTC m=+4840.549464866" 
observedRunningTime="2025-12-01 11:52:03.787231525 +0000 UTC m=+4841.021702443" watchObservedRunningTime="2025-12-01 11:52:03.791923242 +0000 UTC m=+4841.026394150" Dec 01 11:52:04 crc kubenswrapper[4909]: I1201 11:52:04.775016 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqtxp" event={"ID":"c4a09f1a-fdcc-4419-8758-a59a35eff793","Type":"ContainerStarted","Data":"0f4c6f0114980185eb04cb24afd9b386311df1f8a84fe061ce207020ee1bb444"} Dec 01 11:52:04 crc kubenswrapper[4909]: I1201 11:52:04.804218 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nqtxp" podStartSLOduration=3.408316379 podStartE2EDuration="5.804190774s" podCreationTimestamp="2025-12-01 11:51:59 +0000 UTC" firstStartedPulling="2025-12-01 11:52:01.743703855 +0000 UTC m=+4838.978174763" lastFinishedPulling="2025-12-01 11:52:04.13957826 +0000 UTC m=+4841.374049158" observedRunningTime="2025-12-01 11:52:04.792218878 +0000 UTC m=+4842.026689786" watchObservedRunningTime="2025-12-01 11:52:04.804190774 +0000 UTC m=+4842.038661692" Dec 01 11:52:09 crc kubenswrapper[4909]: I1201 11:52:09.894132 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-525z5" Dec 01 11:52:09 crc kubenswrapper[4909]: I1201 11:52:09.894748 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-525z5" Dec 01 11:52:09 crc kubenswrapper[4909]: I1201 11:52:09.945597 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-525z5" Dec 01 11:52:10 crc kubenswrapper[4909]: I1201 11:52:10.081528 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nqtxp" Dec 01 11:52:10 crc kubenswrapper[4909]: I1201 11:52:10.081595 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nqtxp" Dec 01 11:52:10 crc kubenswrapper[4909]: I1201 11:52:10.130332 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nqtxp" Dec 01 11:52:10 crc kubenswrapper[4909]: I1201 11:52:10.870889 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nqtxp" Dec 01 11:52:10 crc kubenswrapper[4909]: I1201 11:52:10.876657 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-525z5" Dec 01 11:52:12 crc kubenswrapper[4909]: I1201 11:52:12.533915 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqtxp"] Dec 01 11:52:12 crc kubenswrapper[4909]: I1201 11:52:12.834378 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nqtxp" podUID="c4a09f1a-fdcc-4419-8758-a59a35eff793" containerName="registry-server" containerID="cri-o://0f4c6f0114980185eb04cb24afd9b386311df1f8a84fe061ce207020ee1bb444" gracePeriod=2 Dec 01 11:52:13 crc kubenswrapper[4909]: I1201 11:52:13.142495 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-525z5"] Dec 01 11:52:13 crc kubenswrapper[4909]: I1201 11:52:13.142842 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-525z5" podUID="0d0911a9-c294-46e4-9fc4-8f5e44676f94" containerName="registry-server" containerID="cri-o://39f8fe62e555ce6888022e762a2b799f63e160d4bfbab6862063ddf59038319a" gracePeriod=2 Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:13.607914 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-525z5" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:13.731337 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d0911a9-c294-46e4-9fc4-8f5e44676f94-utilities\") pod \"0d0911a9-c294-46e4-9fc4-8f5e44676f94\" (UID: \"0d0911a9-c294-46e4-9fc4-8f5e44676f94\") " Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:13.731785 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d0911a9-c294-46e4-9fc4-8f5e44676f94-catalog-content\") pod \"0d0911a9-c294-46e4-9fc4-8f5e44676f94\" (UID: \"0d0911a9-c294-46e4-9fc4-8f5e44676f94\") " Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:13.732205 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d0911a9-c294-46e4-9fc4-8f5e44676f94-utilities" (OuterVolumeSpecName: "utilities") pod "0d0911a9-c294-46e4-9fc4-8f5e44676f94" (UID: "0d0911a9-c294-46e4-9fc4-8f5e44676f94"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:13.731868 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnb6d\" (UniqueName: \"kubernetes.io/projected/0d0911a9-c294-46e4-9fc4-8f5e44676f94-kube-api-access-dnb6d\") pod \"0d0911a9-c294-46e4-9fc4-8f5e44676f94\" (UID: \"0d0911a9-c294-46e4-9fc4-8f5e44676f94\") " Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:13.733532 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d0911a9-c294-46e4-9fc4-8f5e44676f94-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:13.737474 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d0911a9-c294-46e4-9fc4-8f5e44676f94-kube-api-access-dnb6d" (OuterVolumeSpecName: "kube-api-access-dnb6d") pod "0d0911a9-c294-46e4-9fc4-8f5e44676f94" (UID: "0d0911a9-c294-46e4-9fc4-8f5e44676f94"). InnerVolumeSpecName "kube-api-access-dnb6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:13.807134 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d0911a9-c294-46e4-9fc4-8f5e44676f94-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d0911a9-c294-46e4-9fc4-8f5e44676f94" (UID: "0d0911a9-c294-46e4-9fc4-8f5e44676f94"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:13.835098 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d0911a9-c294-46e4-9fc4-8f5e44676f94-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:13.835129 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnb6d\" (UniqueName: \"kubernetes.io/projected/0d0911a9-c294-46e4-9fc4-8f5e44676f94-kube-api-access-dnb6d\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:13.867567 4909 generic.go:334] "Generic (PLEG): container finished" podID="0d0911a9-c294-46e4-9fc4-8f5e44676f94" containerID="39f8fe62e555ce6888022e762a2b799f63e160d4bfbab6862063ddf59038319a" exitCode=0 Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:13.867651 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-525z5" event={"ID":"0d0911a9-c294-46e4-9fc4-8f5e44676f94","Type":"ContainerDied","Data":"39f8fe62e555ce6888022e762a2b799f63e160d4bfbab6862063ddf59038319a"} Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:13.867695 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-525z5" event={"ID":"0d0911a9-c294-46e4-9fc4-8f5e44676f94","Type":"ContainerDied","Data":"d4eaa4e6612ceb9c163da05c32394b45384d8357daf32b661d2028d76ea2c5a2"} Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:13.867711 4909 scope.go:117] "RemoveContainer" containerID="39f8fe62e555ce6888022e762a2b799f63e160d4bfbab6862063ddf59038319a" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:13.867903 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-525z5" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:13.890824 4909 generic.go:334] "Generic (PLEG): container finished" podID="c4a09f1a-fdcc-4419-8758-a59a35eff793" containerID="0f4c6f0114980185eb04cb24afd9b386311df1f8a84fe061ce207020ee1bb444" exitCode=0 Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:13.890890 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqtxp" event={"ID":"c4a09f1a-fdcc-4419-8758-a59a35eff793","Type":"ContainerDied","Data":"0f4c6f0114980185eb04cb24afd9b386311df1f8a84fe061ce207020ee1bb444"} Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:13.912089 4909 scope.go:117] "RemoveContainer" containerID="7383c7ff5a5d82074b23aaa5d564f01e2d44e06b2d5e053ad9371f974f403fdf" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:13.956955 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-525z5"] Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:13.985518 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-525z5"] Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:14.048143 4909 scope.go:117] "RemoveContainer" containerID="344cef198d5041cd37e2e5bb33e8fa27813319aabe9a166f19db5f7464a6f653" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:14.072850 4909 scope.go:117] "RemoveContainer" containerID="39f8fe62e555ce6888022e762a2b799f63e160d4bfbab6862063ddf59038319a" Dec 01 11:52:14 crc kubenswrapper[4909]: E1201 11:52:14.073399 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39f8fe62e555ce6888022e762a2b799f63e160d4bfbab6862063ddf59038319a\": container with ID starting with 39f8fe62e555ce6888022e762a2b799f63e160d4bfbab6862063ddf59038319a not found: ID does not exist" 
containerID="39f8fe62e555ce6888022e762a2b799f63e160d4bfbab6862063ddf59038319a" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:14.073442 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39f8fe62e555ce6888022e762a2b799f63e160d4bfbab6862063ddf59038319a"} err="failed to get container status \"39f8fe62e555ce6888022e762a2b799f63e160d4bfbab6862063ddf59038319a\": rpc error: code = NotFound desc = could not find container \"39f8fe62e555ce6888022e762a2b799f63e160d4bfbab6862063ddf59038319a\": container with ID starting with 39f8fe62e555ce6888022e762a2b799f63e160d4bfbab6862063ddf59038319a not found: ID does not exist" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:14.073473 4909 scope.go:117] "RemoveContainer" containerID="7383c7ff5a5d82074b23aaa5d564f01e2d44e06b2d5e053ad9371f974f403fdf" Dec 01 11:52:14 crc kubenswrapper[4909]: E1201 11:52:14.074000 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7383c7ff5a5d82074b23aaa5d564f01e2d44e06b2d5e053ad9371f974f403fdf\": container with ID starting with 7383c7ff5a5d82074b23aaa5d564f01e2d44e06b2d5e053ad9371f974f403fdf not found: ID does not exist" containerID="7383c7ff5a5d82074b23aaa5d564f01e2d44e06b2d5e053ad9371f974f403fdf" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:14.074044 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7383c7ff5a5d82074b23aaa5d564f01e2d44e06b2d5e053ad9371f974f403fdf"} err="failed to get container status \"7383c7ff5a5d82074b23aaa5d564f01e2d44e06b2d5e053ad9371f974f403fdf\": rpc error: code = NotFound desc = could not find container \"7383c7ff5a5d82074b23aaa5d564f01e2d44e06b2d5e053ad9371f974f403fdf\": container with ID starting with 7383c7ff5a5d82074b23aaa5d564f01e2d44e06b2d5e053ad9371f974f403fdf not found: ID does not exist" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:14.074076 4909 scope.go:117] 
"RemoveContainer" containerID="344cef198d5041cd37e2e5bb33e8fa27813319aabe9a166f19db5f7464a6f653" Dec 01 11:52:14 crc kubenswrapper[4909]: E1201 11:52:14.076807 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"344cef198d5041cd37e2e5bb33e8fa27813319aabe9a166f19db5f7464a6f653\": container with ID starting with 344cef198d5041cd37e2e5bb33e8fa27813319aabe9a166f19db5f7464a6f653 not found: ID does not exist" containerID="344cef198d5041cd37e2e5bb33e8fa27813319aabe9a166f19db5f7464a6f653" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:14.076846 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"344cef198d5041cd37e2e5bb33e8fa27813319aabe9a166f19db5f7464a6f653"} err="failed to get container status \"344cef198d5041cd37e2e5bb33e8fa27813319aabe9a166f19db5f7464a6f653\": rpc error: code = NotFound desc = could not find container \"344cef198d5041cd37e2e5bb33e8fa27813319aabe9a166f19db5f7464a6f653\": container with ID starting with 344cef198d5041cd37e2e5bb33e8fa27813319aabe9a166f19db5f7464a6f653 not found: ID does not exist" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:14.817952 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nqtxp" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:14.900640 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nqtxp" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:14.900635 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqtxp" event={"ID":"c4a09f1a-fdcc-4419-8758-a59a35eff793","Type":"ContainerDied","Data":"5b27f1c939ad2613e5d4757639a89cc094a62cf07cd5a67e49f65f35caaf57be"} Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:14.900833 4909 scope.go:117] "RemoveContainer" containerID="0f4c6f0114980185eb04cb24afd9b386311df1f8a84fe061ce207020ee1bb444" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:14.920975 4909 scope.go:117] "RemoveContainer" containerID="dab2000233aa7622e1d6889b5872b9f17784eb52417c51ea78c8b22f7d6baad1" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:14.944998 4909 scope.go:117] "RemoveContainer" containerID="61b045190851f6cb3e8ded7e0a5a82e1262b6f17d0b722cb198b9d461c89bda2" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:14.952592 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4a09f1a-fdcc-4419-8758-a59a35eff793-catalog-content\") pod \"c4a09f1a-fdcc-4419-8758-a59a35eff793\" (UID: \"c4a09f1a-fdcc-4419-8758-a59a35eff793\") " Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:14.952883 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf775\" (UniqueName: \"kubernetes.io/projected/c4a09f1a-fdcc-4419-8758-a59a35eff793-kube-api-access-kf775\") pod \"c4a09f1a-fdcc-4419-8758-a59a35eff793\" (UID: \"c4a09f1a-fdcc-4419-8758-a59a35eff793\") " Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:14.952925 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4a09f1a-fdcc-4419-8758-a59a35eff793-utilities\") pod \"c4a09f1a-fdcc-4419-8758-a59a35eff793\" (UID: 
\"c4a09f1a-fdcc-4419-8758-a59a35eff793\") " Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:14.954075 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4a09f1a-fdcc-4419-8758-a59a35eff793-utilities" (OuterVolumeSpecName: "utilities") pod "c4a09f1a-fdcc-4419-8758-a59a35eff793" (UID: "c4a09f1a-fdcc-4419-8758-a59a35eff793"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:14.958134 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4a09f1a-fdcc-4419-8758-a59a35eff793-kube-api-access-kf775" (OuterVolumeSpecName: "kube-api-access-kf775") pod "c4a09f1a-fdcc-4419-8758-a59a35eff793" (UID: "c4a09f1a-fdcc-4419-8758-a59a35eff793"). InnerVolumeSpecName "kube-api-access-kf775". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:52:14 crc kubenswrapper[4909]: I1201 11:52:14.972217 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4a09f1a-fdcc-4419-8758-a59a35eff793-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4a09f1a-fdcc-4419-8758-a59a35eff793" (UID: "c4a09f1a-fdcc-4419-8758-a59a35eff793"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:52:15 crc kubenswrapper[4909]: I1201 11:52:15.055173 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4a09f1a-fdcc-4419-8758-a59a35eff793-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:15 crc kubenswrapper[4909]: I1201 11:52:15.055212 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4a09f1a-fdcc-4419-8758-a59a35eff793-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:15 crc kubenswrapper[4909]: I1201 11:52:15.055228 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf775\" (UniqueName: \"kubernetes.io/projected/c4a09f1a-fdcc-4419-8758-a59a35eff793-kube-api-access-kf775\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:15 crc kubenswrapper[4909]: I1201 11:52:15.230676 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqtxp"] Dec 01 11:52:15 crc kubenswrapper[4909]: I1201 11:52:15.240179 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqtxp"] Dec 01 11:52:15 crc kubenswrapper[4909]: I1201 11:52:15.267926 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d0911a9-c294-46e4-9fc4-8f5e44676f94" path="/var/lib/kubelet/pods/0d0911a9-c294-46e4-9fc4-8f5e44676f94/volumes" Dec 01 11:52:15 crc kubenswrapper[4909]: I1201 11:52:15.268710 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4a09f1a-fdcc-4419-8758-a59a35eff793" path="/var/lib/kubelet/pods/c4a09f1a-fdcc-4419-8758-a59a35eff793/volumes" Dec 01 11:53:06 crc kubenswrapper[4909]: I1201 11:53:06.193254 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:53:06 crc kubenswrapper[4909]: I1201 11:53:06.193842 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:53:36 crc kubenswrapper[4909]: I1201 11:53:36.193693 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:53:36 crc kubenswrapper[4909]: I1201 11:53:36.194282 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:54:06 crc kubenswrapper[4909]: I1201 11:54:06.193610 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:54:06 crc kubenswrapper[4909]: I1201 11:54:06.194157 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:54:06 crc kubenswrapper[4909]: I1201 
11:54:06.194206 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 11:54:06 crc kubenswrapper[4909]: I1201 11:54:06.195051 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807"} pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 11:54:06 crc kubenswrapper[4909]: I1201 11:54:06.195106 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" containerID="cri-o://3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" gracePeriod=600 Dec 01 11:54:06 crc kubenswrapper[4909]: E1201 11:54:06.315889 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:54:06 crc kubenswrapper[4909]: I1201 11:54:06.861655 4909 generic.go:334] "Generic (PLEG): container finished" podID="672850e4-d044-44cc-b8a2-517dc1a285be" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" exitCode=0 Dec 01 11:54:06 crc kubenswrapper[4909]: I1201 11:54:06.861762 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" 
event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerDied","Data":"3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807"} Dec 01 11:54:06 crc kubenswrapper[4909]: I1201 11:54:06.862048 4909 scope.go:117] "RemoveContainer" containerID="f95bf45543b4dac8307be2477f4df53e5276c3bbf60164c22c073a1368e361ee" Dec 01 11:54:06 crc kubenswrapper[4909]: I1201 11:54:06.862846 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:54:06 crc kubenswrapper[4909]: E1201 11:54:06.863421 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:54:21 crc kubenswrapper[4909]: I1201 11:54:21.257846 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:54:21 crc kubenswrapper[4909]: E1201 11:54:21.258671 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:54:34 crc kubenswrapper[4909]: I1201 11:54:34.257526 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:54:34 crc kubenswrapper[4909]: E1201 11:54:34.258305 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:54:46 crc kubenswrapper[4909]: I1201 11:54:46.256497 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:54:46 crc kubenswrapper[4909]: E1201 11:54:46.257412 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:55:01 crc kubenswrapper[4909]: I1201 11:55:01.257229 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:55:01 crc kubenswrapper[4909]: E1201 11:55:01.258107 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:55:15 crc kubenswrapper[4909]: I1201 11:55:15.257062 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:55:15 crc kubenswrapper[4909]: E1201 11:55:15.257749 4909 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:55:26 crc kubenswrapper[4909]: I1201 11:55:26.258196 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:55:26 crc kubenswrapper[4909]: E1201 11:55:26.260088 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:55:38 crc kubenswrapper[4909]: I1201 11:55:38.258588 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:55:38 crc kubenswrapper[4909]: E1201 11:55:38.259546 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:55:52 crc kubenswrapper[4909]: I1201 11:55:52.258626 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:55:52 crc kubenswrapper[4909]: E1201 11:55:52.259454 4909 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:56:03 crc kubenswrapper[4909]: I1201 11:56:03.270905 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:56:03 crc kubenswrapper[4909]: E1201 11:56:03.272291 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:56:18 crc kubenswrapper[4909]: I1201 11:56:18.257730 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:56:18 crc kubenswrapper[4909]: E1201 11:56:18.258584 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:56:31 crc kubenswrapper[4909]: I1201 11:56:31.260391 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:56:31 crc kubenswrapper[4909]: E1201 11:56:31.261102 4909 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.026289 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf"] Dec 01 11:56:34 crc kubenswrapper[4909]: E1201 11:56:34.027428 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a09f1a-fdcc-4419-8758-a59a35eff793" containerName="registry-server" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.027451 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a09f1a-fdcc-4419-8758-a59a35eff793" containerName="registry-server" Dec 01 11:56:34 crc kubenswrapper[4909]: E1201 11:56:34.027472 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d0911a9-c294-46e4-9fc4-8f5e44676f94" containerName="extract-content" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.027479 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d0911a9-c294-46e4-9fc4-8f5e44676f94" containerName="extract-content" Dec 01 11:56:34 crc kubenswrapper[4909]: E1201 11:56:34.027492 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d0911a9-c294-46e4-9fc4-8f5e44676f94" containerName="registry-server" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.027500 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d0911a9-c294-46e4-9fc4-8f5e44676f94" containerName="registry-server" Dec 01 11:56:34 crc kubenswrapper[4909]: E1201 11:56:34.027516 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d0911a9-c294-46e4-9fc4-8f5e44676f94" 
containerName="extract-utilities" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.027522 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d0911a9-c294-46e4-9fc4-8f5e44676f94" containerName="extract-utilities" Dec 01 11:56:34 crc kubenswrapper[4909]: E1201 11:56:34.027534 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a09f1a-fdcc-4419-8758-a59a35eff793" containerName="extract-utilities" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.027540 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a09f1a-fdcc-4419-8758-a59a35eff793" containerName="extract-utilities" Dec 01 11:56:34 crc kubenswrapper[4909]: E1201 11:56:34.027558 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a09f1a-fdcc-4419-8758-a59a35eff793" containerName="extract-content" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.027565 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a09f1a-fdcc-4419-8758-a59a35eff793" containerName="extract-content" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.027735 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4a09f1a-fdcc-4419-8758-a59a35eff793" containerName="registry-server" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.027755 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d0911a9-c294-46e4-9fc4-8f5e44676f94" containerName="registry-server" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.028535 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.030366 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.030724 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.031093 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-572jv" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.031834 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.032667 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.033845 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.046076 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf"] Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.100382 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmjqk\" (UniqueName: \"kubernetes.io/projected/50803ad1-ed73-4df3-bddc-6d8fd12aa087-kube-api-access-kmjqk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.100483 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.100507 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.100650 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.100800 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.100884 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.202861 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.202936 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.203025 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.203077 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.203136 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.203166 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmjqk\" (UniqueName: \"kubernetes.io/projected/50803ad1-ed73-4df3-bddc-6d8fd12aa087-kube-api-access-kmjqk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.209486 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.209678 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.209686 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 11:56:34 crc 
kubenswrapper[4909]: I1201 11:56:34.209733 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.220597 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.224437 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmjqk\" (UniqueName: \"kubernetes.io/projected/50803ad1-ed73-4df3-bddc-6d8fd12aa087-kube-api-access-kmjqk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.349030 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 11:56:34 crc kubenswrapper[4909]: I1201 11:56:34.877089 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf"] Dec 01 11:56:35 crc kubenswrapper[4909]: I1201 11:56:35.053747 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" event={"ID":"50803ad1-ed73-4df3-bddc-6d8fd12aa087","Type":"ContainerStarted","Data":"bfbdf42d200db9bf41d5be64ac776aeb3c56ef1bf1b4b3b0bdc514e678118d0f"} Dec 01 11:56:36 crc kubenswrapper[4909]: I1201 11:56:36.066292 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" event={"ID":"50803ad1-ed73-4df3-bddc-6d8fd12aa087","Type":"ContainerStarted","Data":"33532eff699c87e2f8dd804f4a287112f8972e1f2ed33d6038036e3c163cd387"} Dec 01 11:56:36 crc kubenswrapper[4909]: I1201 11:56:36.102002 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" podStartSLOduration=1.463823794 podStartE2EDuration="2.101976141s" podCreationTimestamp="2025-12-01 11:56:34 +0000 UTC" firstStartedPulling="2025-12-01 11:56:34.878573494 +0000 UTC m=+5112.113044392" lastFinishedPulling="2025-12-01 11:56:35.516725841 +0000 UTC m=+5112.751196739" observedRunningTime="2025-12-01 11:56:36.089933464 +0000 UTC m=+5113.324404372" watchObservedRunningTime="2025-12-01 11:56:36.101976141 +0000 UTC m=+5113.336447059" Dec 01 11:56:46 crc kubenswrapper[4909]: I1201 11:56:46.258243 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:56:46 crc kubenswrapper[4909]: E1201 11:56:46.259557 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:57:01 crc kubenswrapper[4909]: I1201 11:57:01.258014 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:57:01 crc kubenswrapper[4909]: E1201 11:57:01.258918 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:57:15 crc kubenswrapper[4909]: I1201 11:57:15.256851 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:57:15 crc kubenswrapper[4909]: E1201 11:57:15.257536 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:57:29 crc kubenswrapper[4909]: I1201 11:57:29.256952 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:57:29 crc kubenswrapper[4909]: E1201 11:57:29.257657 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:57:41 crc kubenswrapper[4909]: I1201 11:57:41.257241 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:57:41 crc kubenswrapper[4909]: E1201 11:57:41.258057 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:57:55 crc kubenswrapper[4909]: I1201 11:57:55.257023 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:57:55 crc kubenswrapper[4909]: E1201 11:57:55.257870 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:57:57 crc kubenswrapper[4909]: I1201 11:57:57.130644 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wvxc6"] Dec 01 11:57:57 crc kubenswrapper[4909]: I1201 11:57:57.134071 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wvxc6" Dec 01 11:57:57 crc kubenswrapper[4909]: I1201 11:57:57.139233 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wvxc6"] Dec 01 11:57:57 crc kubenswrapper[4909]: I1201 11:57:57.186649 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84qkm\" (UniqueName: \"kubernetes.io/projected/ad38d3ef-1bf8-43d2-b067-208976b9b583-kube-api-access-84qkm\") pod \"community-operators-wvxc6\" (UID: \"ad38d3ef-1bf8-43d2-b067-208976b9b583\") " pod="openshift-marketplace/community-operators-wvxc6" Dec 01 11:57:57 crc kubenswrapper[4909]: I1201 11:57:57.187240 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad38d3ef-1bf8-43d2-b067-208976b9b583-catalog-content\") pod \"community-operators-wvxc6\" (UID: \"ad38d3ef-1bf8-43d2-b067-208976b9b583\") " pod="openshift-marketplace/community-operators-wvxc6" Dec 01 11:57:57 crc kubenswrapper[4909]: I1201 11:57:57.187537 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad38d3ef-1bf8-43d2-b067-208976b9b583-utilities\") pod \"community-operators-wvxc6\" (UID: \"ad38d3ef-1bf8-43d2-b067-208976b9b583\") " pod="openshift-marketplace/community-operators-wvxc6" Dec 01 11:57:57 crc kubenswrapper[4909]: I1201 11:57:57.289917 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84qkm\" (UniqueName: \"kubernetes.io/projected/ad38d3ef-1bf8-43d2-b067-208976b9b583-kube-api-access-84qkm\") pod \"community-operators-wvxc6\" (UID: \"ad38d3ef-1bf8-43d2-b067-208976b9b583\") " pod="openshift-marketplace/community-operators-wvxc6" Dec 01 11:57:57 crc kubenswrapper[4909]: I1201 11:57:57.290207 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad38d3ef-1bf8-43d2-b067-208976b9b583-catalog-content\") pod \"community-operators-wvxc6\" (UID: \"ad38d3ef-1bf8-43d2-b067-208976b9b583\") " pod="openshift-marketplace/community-operators-wvxc6" Dec 01 11:57:57 crc kubenswrapper[4909]: I1201 11:57:57.290350 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad38d3ef-1bf8-43d2-b067-208976b9b583-utilities\") pod \"community-operators-wvxc6\" (UID: \"ad38d3ef-1bf8-43d2-b067-208976b9b583\") " pod="openshift-marketplace/community-operators-wvxc6" Dec 01 11:57:57 crc kubenswrapper[4909]: I1201 11:57:57.290841 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad38d3ef-1bf8-43d2-b067-208976b9b583-utilities\") pod \"community-operators-wvxc6\" (UID: \"ad38d3ef-1bf8-43d2-b067-208976b9b583\") " pod="openshift-marketplace/community-operators-wvxc6" Dec 01 11:57:57 crc kubenswrapper[4909]: I1201 11:57:57.290928 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad38d3ef-1bf8-43d2-b067-208976b9b583-catalog-content\") pod \"community-operators-wvxc6\" (UID: \"ad38d3ef-1bf8-43d2-b067-208976b9b583\") " pod="openshift-marketplace/community-operators-wvxc6" Dec 01 11:57:57 crc kubenswrapper[4909]: I1201 11:57:57.309546 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84qkm\" (UniqueName: \"kubernetes.io/projected/ad38d3ef-1bf8-43d2-b067-208976b9b583-kube-api-access-84qkm\") pod \"community-operators-wvxc6\" (UID: \"ad38d3ef-1bf8-43d2-b067-208976b9b583\") " pod="openshift-marketplace/community-operators-wvxc6" Dec 01 11:57:57 crc kubenswrapper[4909]: I1201 11:57:57.454286 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wvxc6" Dec 01 11:57:57 crc kubenswrapper[4909]: I1201 11:57:57.939661 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wvxc6"] Dec 01 11:57:58 crc kubenswrapper[4909]: I1201 11:57:58.784522 4909 generic.go:334] "Generic (PLEG): container finished" podID="ad38d3ef-1bf8-43d2-b067-208976b9b583" containerID="526b22ae15090beb36bd2fa396c2ede54724c4a351df10c63be602df30acfa74" exitCode=0 Dec 01 11:57:58 crc kubenswrapper[4909]: I1201 11:57:58.784839 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wvxc6" event={"ID":"ad38d3ef-1bf8-43d2-b067-208976b9b583","Type":"ContainerDied","Data":"526b22ae15090beb36bd2fa396c2ede54724c4a351df10c63be602df30acfa74"} Dec 01 11:57:58 crc kubenswrapper[4909]: I1201 11:57:58.784903 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wvxc6" event={"ID":"ad38d3ef-1bf8-43d2-b067-208976b9b583","Type":"ContainerStarted","Data":"37943ee8b2490a02ca21cc1918431cd6e82b18192fb7b911402384050eef1a91"} Dec 01 11:57:58 crc kubenswrapper[4909]: I1201 11:57:58.787337 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 11:58:00 crc kubenswrapper[4909]: I1201 11:58:00.805268 4909 generic.go:334] "Generic (PLEG): container finished" podID="ad38d3ef-1bf8-43d2-b067-208976b9b583" containerID="c469e2bbbdaa8bcd6b701fa0e9880317f0f71fb84442b9606464ca9ca4e593f6" exitCode=0 Dec 01 11:58:00 crc kubenswrapper[4909]: I1201 11:58:00.805681 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wvxc6" event={"ID":"ad38d3ef-1bf8-43d2-b067-208976b9b583","Type":"ContainerDied","Data":"c469e2bbbdaa8bcd6b701fa0e9880317f0f71fb84442b9606464ca9ca4e593f6"} Dec 01 11:58:02 crc kubenswrapper[4909]: I1201 11:58:02.827093 4909 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-wvxc6" event={"ID":"ad38d3ef-1bf8-43d2-b067-208976b9b583","Type":"ContainerStarted","Data":"d120cdf068ce72877bab73ae304dec90a99a515557ec67952bda061984a268ac"} Dec 01 11:58:07 crc kubenswrapper[4909]: I1201 11:58:07.454469 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wvxc6" Dec 01 11:58:07 crc kubenswrapper[4909]: I1201 11:58:07.455094 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wvxc6" Dec 01 11:58:07 crc kubenswrapper[4909]: I1201 11:58:07.540392 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wvxc6" Dec 01 11:58:07 crc kubenswrapper[4909]: I1201 11:58:07.568061 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wvxc6" podStartSLOduration=7.622471253 podStartE2EDuration="10.568036688s" podCreationTimestamp="2025-12-01 11:57:57 +0000 UTC" firstStartedPulling="2025-12-01 11:57:58.787144611 +0000 UTC m=+5196.021615509" lastFinishedPulling="2025-12-01 11:58:01.732710046 +0000 UTC m=+5198.967180944" observedRunningTime="2025-12-01 11:58:02.853480178 +0000 UTC m=+5200.087951076" watchObservedRunningTime="2025-12-01 11:58:07.568036688 +0000 UTC m=+5204.802507586" Dec 01 11:58:07 crc kubenswrapper[4909]: I1201 11:58:07.921086 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wvxc6" Dec 01 11:58:07 crc kubenswrapper[4909]: I1201 11:58:07.982567 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wvxc6"] Dec 01 11:58:08 crc kubenswrapper[4909]: I1201 11:58:08.257427 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:58:08 crc 
kubenswrapper[4909]: E1201 11:58:08.257664 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:58:09 crc kubenswrapper[4909]: I1201 11:58:09.900070 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wvxc6" podUID="ad38d3ef-1bf8-43d2-b067-208976b9b583" containerName="registry-server" containerID="cri-o://d120cdf068ce72877bab73ae304dec90a99a515557ec67952bda061984a268ac" gracePeriod=2 Dec 01 11:58:10 crc kubenswrapper[4909]: I1201 11:58:10.496774 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wvxc6" Dec 01 11:58:10 crc kubenswrapper[4909]: I1201 11:58:10.696362 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad38d3ef-1bf8-43d2-b067-208976b9b583-catalog-content\") pod \"ad38d3ef-1bf8-43d2-b067-208976b9b583\" (UID: \"ad38d3ef-1bf8-43d2-b067-208976b9b583\") " Dec 01 11:58:10 crc kubenswrapper[4909]: I1201 11:58:10.697064 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84qkm\" (UniqueName: \"kubernetes.io/projected/ad38d3ef-1bf8-43d2-b067-208976b9b583-kube-api-access-84qkm\") pod \"ad38d3ef-1bf8-43d2-b067-208976b9b583\" (UID: \"ad38d3ef-1bf8-43d2-b067-208976b9b583\") " Dec 01 11:58:10 crc kubenswrapper[4909]: I1201 11:58:10.697125 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ad38d3ef-1bf8-43d2-b067-208976b9b583-utilities\") pod \"ad38d3ef-1bf8-43d2-b067-208976b9b583\" (UID: \"ad38d3ef-1bf8-43d2-b067-208976b9b583\") " Dec 01 11:58:10 crc kubenswrapper[4909]: I1201 11:58:10.698447 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad38d3ef-1bf8-43d2-b067-208976b9b583-utilities" (OuterVolumeSpecName: "utilities") pod "ad38d3ef-1bf8-43d2-b067-208976b9b583" (UID: "ad38d3ef-1bf8-43d2-b067-208976b9b583"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:58:10 crc kubenswrapper[4909]: I1201 11:58:10.715470 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad38d3ef-1bf8-43d2-b067-208976b9b583-kube-api-access-84qkm" (OuterVolumeSpecName: "kube-api-access-84qkm") pod "ad38d3ef-1bf8-43d2-b067-208976b9b583" (UID: "ad38d3ef-1bf8-43d2-b067-208976b9b583"). InnerVolumeSpecName "kube-api-access-84qkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:58:10 crc kubenswrapper[4909]: I1201 11:58:10.747839 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad38d3ef-1bf8-43d2-b067-208976b9b583-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad38d3ef-1bf8-43d2-b067-208976b9b583" (UID: "ad38d3ef-1bf8-43d2-b067-208976b9b583"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:58:10 crc kubenswrapper[4909]: I1201 11:58:10.799665 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad38d3ef-1bf8-43d2-b067-208976b9b583-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:58:10 crc kubenswrapper[4909]: I1201 11:58:10.799707 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84qkm\" (UniqueName: \"kubernetes.io/projected/ad38d3ef-1bf8-43d2-b067-208976b9b583-kube-api-access-84qkm\") on node \"crc\" DevicePath \"\"" Dec 01 11:58:10 crc kubenswrapper[4909]: I1201 11:58:10.799719 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad38d3ef-1bf8-43d2-b067-208976b9b583-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:58:10 crc kubenswrapper[4909]: I1201 11:58:10.910493 4909 generic.go:334] "Generic (PLEG): container finished" podID="ad38d3ef-1bf8-43d2-b067-208976b9b583" containerID="d120cdf068ce72877bab73ae304dec90a99a515557ec67952bda061984a268ac" exitCode=0 Dec 01 11:58:10 crc kubenswrapper[4909]: I1201 11:58:10.910561 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wvxc6" event={"ID":"ad38d3ef-1bf8-43d2-b067-208976b9b583","Type":"ContainerDied","Data":"d120cdf068ce72877bab73ae304dec90a99a515557ec67952bda061984a268ac"} Dec 01 11:58:10 crc kubenswrapper[4909]: I1201 11:58:10.910619 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wvxc6" Dec 01 11:58:10 crc kubenswrapper[4909]: I1201 11:58:10.911423 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wvxc6" event={"ID":"ad38d3ef-1bf8-43d2-b067-208976b9b583","Type":"ContainerDied","Data":"37943ee8b2490a02ca21cc1918431cd6e82b18192fb7b911402384050eef1a91"} Dec 01 11:58:10 crc kubenswrapper[4909]: I1201 11:58:10.911441 4909 scope.go:117] "RemoveContainer" containerID="d120cdf068ce72877bab73ae304dec90a99a515557ec67952bda061984a268ac" Dec 01 11:58:10 crc kubenswrapper[4909]: I1201 11:58:10.934731 4909 scope.go:117] "RemoveContainer" containerID="c469e2bbbdaa8bcd6b701fa0e9880317f0f71fb84442b9606464ca9ca4e593f6" Dec 01 11:58:10 crc kubenswrapper[4909]: I1201 11:58:10.956510 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wvxc6"] Dec 01 11:58:10 crc kubenswrapper[4909]: I1201 11:58:10.964961 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wvxc6"] Dec 01 11:58:10 crc kubenswrapper[4909]: I1201 11:58:10.977000 4909 scope.go:117] "RemoveContainer" containerID="526b22ae15090beb36bd2fa396c2ede54724c4a351df10c63be602df30acfa74" Dec 01 11:58:11 crc kubenswrapper[4909]: I1201 11:58:11.000434 4909 scope.go:117] "RemoveContainer" containerID="d120cdf068ce72877bab73ae304dec90a99a515557ec67952bda061984a268ac" Dec 01 11:58:11 crc kubenswrapper[4909]: E1201 11:58:11.000994 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d120cdf068ce72877bab73ae304dec90a99a515557ec67952bda061984a268ac\": container with ID starting with d120cdf068ce72877bab73ae304dec90a99a515557ec67952bda061984a268ac not found: ID does not exist" containerID="d120cdf068ce72877bab73ae304dec90a99a515557ec67952bda061984a268ac" Dec 01 11:58:11 crc kubenswrapper[4909]: I1201 11:58:11.001038 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d120cdf068ce72877bab73ae304dec90a99a515557ec67952bda061984a268ac"} err="failed to get container status \"d120cdf068ce72877bab73ae304dec90a99a515557ec67952bda061984a268ac\": rpc error: code = NotFound desc = could not find container \"d120cdf068ce72877bab73ae304dec90a99a515557ec67952bda061984a268ac\": container with ID starting with d120cdf068ce72877bab73ae304dec90a99a515557ec67952bda061984a268ac not found: ID does not exist" Dec 01 11:58:11 crc kubenswrapper[4909]: I1201 11:58:11.001072 4909 scope.go:117] "RemoveContainer" containerID="c469e2bbbdaa8bcd6b701fa0e9880317f0f71fb84442b9606464ca9ca4e593f6" Dec 01 11:58:11 crc kubenswrapper[4909]: E1201 11:58:11.001500 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c469e2bbbdaa8bcd6b701fa0e9880317f0f71fb84442b9606464ca9ca4e593f6\": container with ID starting with c469e2bbbdaa8bcd6b701fa0e9880317f0f71fb84442b9606464ca9ca4e593f6 not found: ID does not exist" containerID="c469e2bbbdaa8bcd6b701fa0e9880317f0f71fb84442b9606464ca9ca4e593f6" Dec 01 11:58:11 crc kubenswrapper[4909]: I1201 11:58:11.001548 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c469e2bbbdaa8bcd6b701fa0e9880317f0f71fb84442b9606464ca9ca4e593f6"} err="failed to get container status \"c469e2bbbdaa8bcd6b701fa0e9880317f0f71fb84442b9606464ca9ca4e593f6\": rpc error: code = NotFound desc = could not find container \"c469e2bbbdaa8bcd6b701fa0e9880317f0f71fb84442b9606464ca9ca4e593f6\": container with ID starting with c469e2bbbdaa8bcd6b701fa0e9880317f0f71fb84442b9606464ca9ca4e593f6 not found: ID does not exist" Dec 01 11:58:11 crc kubenswrapper[4909]: I1201 11:58:11.001577 4909 scope.go:117] "RemoveContainer" containerID="526b22ae15090beb36bd2fa396c2ede54724c4a351df10c63be602df30acfa74" Dec 01 11:58:11 crc kubenswrapper[4909]: E1201 
11:58:11.001930 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"526b22ae15090beb36bd2fa396c2ede54724c4a351df10c63be602df30acfa74\": container with ID starting with 526b22ae15090beb36bd2fa396c2ede54724c4a351df10c63be602df30acfa74 not found: ID does not exist" containerID="526b22ae15090beb36bd2fa396c2ede54724c4a351df10c63be602df30acfa74" Dec 01 11:58:11 crc kubenswrapper[4909]: I1201 11:58:11.001952 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"526b22ae15090beb36bd2fa396c2ede54724c4a351df10c63be602df30acfa74"} err="failed to get container status \"526b22ae15090beb36bd2fa396c2ede54724c4a351df10c63be602df30acfa74\": rpc error: code = NotFound desc = could not find container \"526b22ae15090beb36bd2fa396c2ede54724c4a351df10c63be602df30acfa74\": container with ID starting with 526b22ae15090beb36bd2fa396c2ede54724c4a351df10c63be602df30acfa74 not found: ID does not exist" Dec 01 11:58:11 crc kubenswrapper[4909]: I1201 11:58:11.267726 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad38d3ef-1bf8-43d2-b067-208976b9b583" path="/var/lib/kubelet/pods/ad38d3ef-1bf8-43d2-b067-208976b9b583/volumes" Dec 01 11:58:15 crc kubenswrapper[4909]: I1201 11:58:15.915958 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4wlrw"] Dec 01 11:58:15 crc kubenswrapper[4909]: E1201 11:58:15.920313 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad38d3ef-1bf8-43d2-b067-208976b9b583" containerName="extract-utilities" Dec 01 11:58:15 crc kubenswrapper[4909]: I1201 11:58:15.920335 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad38d3ef-1bf8-43d2-b067-208976b9b583" containerName="extract-utilities" Dec 01 11:58:15 crc kubenswrapper[4909]: E1201 11:58:15.920352 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ad38d3ef-1bf8-43d2-b067-208976b9b583" containerName="extract-content" Dec 01 11:58:15 crc kubenswrapper[4909]: I1201 11:58:15.920359 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad38d3ef-1bf8-43d2-b067-208976b9b583" containerName="extract-content" Dec 01 11:58:15 crc kubenswrapper[4909]: E1201 11:58:15.920370 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad38d3ef-1bf8-43d2-b067-208976b9b583" containerName="registry-server" Dec 01 11:58:15 crc kubenswrapper[4909]: I1201 11:58:15.920376 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad38d3ef-1bf8-43d2-b067-208976b9b583" containerName="registry-server" Dec 01 11:58:15 crc kubenswrapper[4909]: I1201 11:58:15.920579 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad38d3ef-1bf8-43d2-b067-208976b9b583" containerName="registry-server" Dec 01 11:58:15 crc kubenswrapper[4909]: I1201 11:58:15.921965 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4wlrw" Dec 01 11:58:15 crc kubenswrapper[4909]: I1201 11:58:15.938263 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4wlrw"] Dec 01 11:58:16 crc kubenswrapper[4909]: I1201 11:58:16.003953 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76945\" (UniqueName: \"kubernetes.io/projected/01e46756-88f0-4c27-af83-9196fd3156ca-kube-api-access-76945\") pod \"redhat-operators-4wlrw\" (UID: \"01e46756-88f0-4c27-af83-9196fd3156ca\") " pod="openshift-marketplace/redhat-operators-4wlrw" Dec 01 11:58:16 crc kubenswrapper[4909]: I1201 11:58:16.004049 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01e46756-88f0-4c27-af83-9196fd3156ca-catalog-content\") pod \"redhat-operators-4wlrw\" (UID: 
\"01e46756-88f0-4c27-af83-9196fd3156ca\") " pod="openshift-marketplace/redhat-operators-4wlrw" Dec 01 11:58:16 crc kubenswrapper[4909]: I1201 11:58:16.004095 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01e46756-88f0-4c27-af83-9196fd3156ca-utilities\") pod \"redhat-operators-4wlrw\" (UID: \"01e46756-88f0-4c27-af83-9196fd3156ca\") " pod="openshift-marketplace/redhat-operators-4wlrw" Dec 01 11:58:16 crc kubenswrapper[4909]: I1201 11:58:16.105803 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76945\" (UniqueName: \"kubernetes.io/projected/01e46756-88f0-4c27-af83-9196fd3156ca-kube-api-access-76945\") pod \"redhat-operators-4wlrw\" (UID: \"01e46756-88f0-4c27-af83-9196fd3156ca\") " pod="openshift-marketplace/redhat-operators-4wlrw" Dec 01 11:58:16 crc kubenswrapper[4909]: I1201 11:58:16.105910 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01e46756-88f0-4c27-af83-9196fd3156ca-catalog-content\") pod \"redhat-operators-4wlrw\" (UID: \"01e46756-88f0-4c27-af83-9196fd3156ca\") " pod="openshift-marketplace/redhat-operators-4wlrw" Dec 01 11:58:16 crc kubenswrapper[4909]: I1201 11:58:16.105939 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01e46756-88f0-4c27-af83-9196fd3156ca-utilities\") pod \"redhat-operators-4wlrw\" (UID: \"01e46756-88f0-4c27-af83-9196fd3156ca\") " pod="openshift-marketplace/redhat-operators-4wlrw" Dec 01 11:58:16 crc kubenswrapper[4909]: I1201 11:58:16.106949 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01e46756-88f0-4c27-af83-9196fd3156ca-utilities\") pod \"redhat-operators-4wlrw\" (UID: \"01e46756-88f0-4c27-af83-9196fd3156ca\") " 
pod="openshift-marketplace/redhat-operators-4wlrw" Dec 01 11:58:16 crc kubenswrapper[4909]: I1201 11:58:16.107117 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01e46756-88f0-4c27-af83-9196fd3156ca-catalog-content\") pod \"redhat-operators-4wlrw\" (UID: \"01e46756-88f0-4c27-af83-9196fd3156ca\") " pod="openshift-marketplace/redhat-operators-4wlrw" Dec 01 11:58:16 crc kubenswrapper[4909]: I1201 11:58:16.130642 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76945\" (UniqueName: \"kubernetes.io/projected/01e46756-88f0-4c27-af83-9196fd3156ca-kube-api-access-76945\") pod \"redhat-operators-4wlrw\" (UID: \"01e46756-88f0-4c27-af83-9196fd3156ca\") " pod="openshift-marketplace/redhat-operators-4wlrw" Dec 01 11:58:16 crc kubenswrapper[4909]: I1201 11:58:16.291090 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4wlrw" Dec 01 11:58:16 crc kubenswrapper[4909]: I1201 11:58:16.763233 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4wlrw"] Dec 01 11:58:16 crc kubenswrapper[4909]: I1201 11:58:16.992500 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wlrw" event={"ID":"01e46756-88f0-4c27-af83-9196fd3156ca","Type":"ContainerStarted","Data":"2efd898b22bd55af57e50aedb142d830f57b8fac4e8bf147aaba54f45556408d"} Dec 01 11:58:16 crc kubenswrapper[4909]: I1201 11:58:16.992943 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wlrw" event={"ID":"01e46756-88f0-4c27-af83-9196fd3156ca","Type":"ContainerStarted","Data":"4902f14123acb7e2fe859af5922b77d396d6a7ef677674fe576deb1c7a783c02"} Dec 01 11:58:18 crc kubenswrapper[4909]: I1201 11:58:18.003615 4909 generic.go:334] "Generic (PLEG): container finished" podID="01e46756-88f0-4c27-af83-9196fd3156ca" 
containerID="2efd898b22bd55af57e50aedb142d830f57b8fac4e8bf147aaba54f45556408d" exitCode=0 Dec 01 11:58:18 crc kubenswrapper[4909]: I1201 11:58:18.004169 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wlrw" event={"ID":"01e46756-88f0-4c27-af83-9196fd3156ca","Type":"ContainerDied","Data":"2efd898b22bd55af57e50aedb142d830f57b8fac4e8bf147aaba54f45556408d"} Dec 01 11:58:19 crc kubenswrapper[4909]: I1201 11:58:19.020332 4909 generic.go:334] "Generic (PLEG): container finished" podID="01e46756-88f0-4c27-af83-9196fd3156ca" containerID="5e832b46e0705f9270e135e1294a303fe67a6b4f6b30eecfbc64879b385a7893" exitCode=0 Dec 01 11:58:19 crc kubenswrapper[4909]: I1201 11:58:19.020386 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wlrw" event={"ID":"01e46756-88f0-4c27-af83-9196fd3156ca","Type":"ContainerDied","Data":"5e832b46e0705f9270e135e1294a303fe67a6b4f6b30eecfbc64879b385a7893"} Dec 01 11:58:19 crc kubenswrapper[4909]: I1201 11:58:19.258254 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:58:19 crc kubenswrapper[4909]: E1201 11:58:19.258508 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:58:20 crc kubenswrapper[4909]: I1201 11:58:20.029525 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wlrw" event={"ID":"01e46756-88f0-4c27-af83-9196fd3156ca","Type":"ContainerStarted","Data":"e133e091b6a1b9a47ac786bf955828abb10d8dc7df3f67bc3deac1670ca0410f"} Dec 01 11:58:20 crc 
kubenswrapper[4909]: I1201 11:58:20.060451 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4wlrw" podStartSLOduration=2.538218125 podStartE2EDuration="5.06043004s" podCreationTimestamp="2025-12-01 11:58:15 +0000 UTC" firstStartedPulling="2025-12-01 11:58:16.995324331 +0000 UTC m=+5214.229795219" lastFinishedPulling="2025-12-01 11:58:19.517536236 +0000 UTC m=+5216.752007134" observedRunningTime="2025-12-01 11:58:20.059431718 +0000 UTC m=+5217.293902606" watchObservedRunningTime="2025-12-01 11:58:20.06043004 +0000 UTC m=+5217.294900938" Dec 01 11:58:26 crc kubenswrapper[4909]: I1201 11:58:26.292333 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4wlrw" Dec 01 11:58:26 crc kubenswrapper[4909]: I1201 11:58:26.293927 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4wlrw" Dec 01 11:58:26 crc kubenswrapper[4909]: I1201 11:58:26.334788 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4wlrw" Dec 01 11:58:27 crc kubenswrapper[4909]: I1201 11:58:27.136770 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4wlrw" Dec 01 11:58:27 crc kubenswrapper[4909]: I1201 11:58:27.570127 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4wlrw"] Dec 01 11:58:29 crc kubenswrapper[4909]: I1201 11:58:29.107002 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4wlrw" podUID="01e46756-88f0-4c27-af83-9196fd3156ca" containerName="registry-server" containerID="cri-o://e133e091b6a1b9a47ac786bf955828abb10d8dc7df3f67bc3deac1670ca0410f" gracePeriod=2 Dec 01 11:58:30 crc kubenswrapper[4909]: I1201 11:58:30.125054 4909 generic.go:334] "Generic (PLEG): 
container finished" podID="01e46756-88f0-4c27-af83-9196fd3156ca" containerID="e133e091b6a1b9a47ac786bf955828abb10d8dc7df3f67bc3deac1670ca0410f" exitCode=0 Dec 01 11:58:30 crc kubenswrapper[4909]: I1201 11:58:30.125100 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wlrw" event={"ID":"01e46756-88f0-4c27-af83-9196fd3156ca","Type":"ContainerDied","Data":"e133e091b6a1b9a47ac786bf955828abb10d8dc7df3f67bc3deac1670ca0410f"} Dec 01 11:58:30 crc kubenswrapper[4909]: I1201 11:58:30.612490 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4wlrw" Dec 01 11:58:30 crc kubenswrapper[4909]: I1201 11:58:30.721369 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01e46756-88f0-4c27-af83-9196fd3156ca-utilities\") pod \"01e46756-88f0-4c27-af83-9196fd3156ca\" (UID: \"01e46756-88f0-4c27-af83-9196fd3156ca\") " Dec 01 11:58:30 crc kubenswrapper[4909]: I1201 11:58:30.721590 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76945\" (UniqueName: \"kubernetes.io/projected/01e46756-88f0-4c27-af83-9196fd3156ca-kube-api-access-76945\") pod \"01e46756-88f0-4c27-af83-9196fd3156ca\" (UID: \"01e46756-88f0-4c27-af83-9196fd3156ca\") " Dec 01 11:58:30 crc kubenswrapper[4909]: I1201 11:58:30.721660 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01e46756-88f0-4c27-af83-9196fd3156ca-catalog-content\") pod \"01e46756-88f0-4c27-af83-9196fd3156ca\" (UID: \"01e46756-88f0-4c27-af83-9196fd3156ca\") " Dec 01 11:58:30 crc kubenswrapper[4909]: I1201 11:58:30.722504 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01e46756-88f0-4c27-af83-9196fd3156ca-utilities" (OuterVolumeSpecName: "utilities") pod 
"01e46756-88f0-4c27-af83-9196fd3156ca" (UID: "01e46756-88f0-4c27-af83-9196fd3156ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:58:30 crc kubenswrapper[4909]: I1201 11:58:30.728481 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01e46756-88f0-4c27-af83-9196fd3156ca-kube-api-access-76945" (OuterVolumeSpecName: "kube-api-access-76945") pod "01e46756-88f0-4c27-af83-9196fd3156ca" (UID: "01e46756-88f0-4c27-af83-9196fd3156ca"). InnerVolumeSpecName "kube-api-access-76945". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:58:30 crc kubenswrapper[4909]: I1201 11:58:30.823719 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76945\" (UniqueName: \"kubernetes.io/projected/01e46756-88f0-4c27-af83-9196fd3156ca-kube-api-access-76945\") on node \"crc\" DevicePath \"\"" Dec 01 11:58:30 crc kubenswrapper[4909]: I1201 11:58:30.823762 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01e46756-88f0-4c27-af83-9196fd3156ca-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:58:30 crc kubenswrapper[4909]: I1201 11:58:30.830778 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01e46756-88f0-4c27-af83-9196fd3156ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01e46756-88f0-4c27-af83-9196fd3156ca" (UID: "01e46756-88f0-4c27-af83-9196fd3156ca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:58:30 crc kubenswrapper[4909]: I1201 11:58:30.926452 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01e46756-88f0-4c27-af83-9196fd3156ca-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:58:31 crc kubenswrapper[4909]: I1201 11:58:31.138183 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wlrw" event={"ID":"01e46756-88f0-4c27-af83-9196fd3156ca","Type":"ContainerDied","Data":"4902f14123acb7e2fe859af5922b77d396d6a7ef677674fe576deb1c7a783c02"} Dec 01 11:58:31 crc kubenswrapper[4909]: I1201 11:58:31.138512 4909 scope.go:117] "RemoveContainer" containerID="e133e091b6a1b9a47ac786bf955828abb10d8dc7df3f67bc3deac1670ca0410f" Dec 01 11:58:31 crc kubenswrapper[4909]: I1201 11:58:31.138257 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4wlrw" Dec 01 11:58:31 crc kubenswrapper[4909]: I1201 11:58:31.156937 4909 scope.go:117] "RemoveContainer" containerID="5e832b46e0705f9270e135e1294a303fe67a6b4f6b30eecfbc64879b385a7893" Dec 01 11:58:31 crc kubenswrapper[4909]: I1201 11:58:31.183173 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4wlrw"] Dec 01 11:58:31 crc kubenswrapper[4909]: I1201 11:58:31.190677 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4wlrw"] Dec 01 11:58:31 crc kubenswrapper[4909]: I1201 11:58:31.201885 4909 scope.go:117] "RemoveContainer" containerID="2efd898b22bd55af57e50aedb142d830f57b8fac4e8bf147aaba54f45556408d" Dec 01 11:58:31 crc kubenswrapper[4909]: I1201 11:58:31.258369 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:58:31 crc kubenswrapper[4909]: E1201 11:58:31.258632 4909 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:58:31 crc kubenswrapper[4909]: I1201 11:58:31.269093 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01e46756-88f0-4c27-af83-9196fd3156ca" path="/var/lib/kubelet/pods/01e46756-88f0-4c27-af83-9196fd3156ca/volumes" Dec 01 11:58:44 crc kubenswrapper[4909]: I1201 11:58:44.257456 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:58:44 crc kubenswrapper[4909]: E1201 11:58:44.258195 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:58:57 crc kubenswrapper[4909]: I1201 11:58:57.257587 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:58:57 crc kubenswrapper[4909]: E1201 11:58:57.258350 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 11:59:08 
crc kubenswrapper[4909]: I1201 11:59:08.257616 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 11:59:09 crc kubenswrapper[4909]: I1201 11:59:09.464489 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"2aaec19c8a09c357b1c58d724097fc2fb7674e854b41091b1ce6a1532c298063"} Dec 01 12:00:00 crc kubenswrapper[4909]: I1201 12:00:00.139819 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409840-67xpl"] Dec 01 12:00:00 crc kubenswrapper[4909]: E1201 12:00:00.140819 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e46756-88f0-4c27-af83-9196fd3156ca" containerName="extract-utilities" Dec 01 12:00:00 crc kubenswrapper[4909]: I1201 12:00:00.140837 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e46756-88f0-4c27-af83-9196fd3156ca" containerName="extract-utilities" Dec 01 12:00:00 crc kubenswrapper[4909]: E1201 12:00:00.140850 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e46756-88f0-4c27-af83-9196fd3156ca" containerName="extract-content" Dec 01 12:00:00 crc kubenswrapper[4909]: I1201 12:00:00.140865 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e46756-88f0-4c27-af83-9196fd3156ca" containerName="extract-content" Dec 01 12:00:00 crc kubenswrapper[4909]: E1201 12:00:00.140890 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e46756-88f0-4c27-af83-9196fd3156ca" containerName="registry-server" Dec 01 12:00:00 crc kubenswrapper[4909]: I1201 12:00:00.140898 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e46756-88f0-4c27-af83-9196fd3156ca" containerName="registry-server" Dec 01 12:00:00 crc kubenswrapper[4909]: I1201 12:00:00.141089 4909 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="01e46756-88f0-4c27-af83-9196fd3156ca" containerName="registry-server" Dec 01 12:00:00 crc kubenswrapper[4909]: I1201 12:00:00.141657 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-67xpl" Dec 01 12:00:00 crc kubenswrapper[4909]: I1201 12:00:00.149160 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 12:00:00 crc kubenswrapper[4909]: I1201 12:00:00.151138 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 12:00:00 crc kubenswrapper[4909]: I1201 12:00:00.201343 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409840-67xpl"] Dec 01 12:00:00 crc kubenswrapper[4909]: I1201 12:00:00.232659 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/460ef7cb-e22d-4f17-aa4d-d6043dd00386-config-volume\") pod \"collect-profiles-29409840-67xpl\" (UID: \"460ef7cb-e22d-4f17-aa4d-d6043dd00386\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-67xpl" Dec 01 12:00:00 crc kubenswrapper[4909]: I1201 12:00:00.232730 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/460ef7cb-e22d-4f17-aa4d-d6043dd00386-secret-volume\") pod \"collect-profiles-29409840-67xpl\" (UID: \"460ef7cb-e22d-4f17-aa4d-d6043dd00386\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-67xpl" Dec 01 12:00:00 crc kubenswrapper[4909]: I1201 12:00:00.232830 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8fzf\" (UniqueName: 
\"kubernetes.io/projected/460ef7cb-e22d-4f17-aa4d-d6043dd00386-kube-api-access-d8fzf\") pod \"collect-profiles-29409840-67xpl\" (UID: \"460ef7cb-e22d-4f17-aa4d-d6043dd00386\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-67xpl" Dec 01 12:00:00 crc kubenswrapper[4909]: I1201 12:00:00.335048 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8fzf\" (UniqueName: \"kubernetes.io/projected/460ef7cb-e22d-4f17-aa4d-d6043dd00386-kube-api-access-d8fzf\") pod \"collect-profiles-29409840-67xpl\" (UID: \"460ef7cb-e22d-4f17-aa4d-d6043dd00386\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-67xpl" Dec 01 12:00:00 crc kubenswrapper[4909]: I1201 12:00:00.335190 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/460ef7cb-e22d-4f17-aa4d-d6043dd00386-config-volume\") pod \"collect-profiles-29409840-67xpl\" (UID: \"460ef7cb-e22d-4f17-aa4d-d6043dd00386\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-67xpl" Dec 01 12:00:00 crc kubenswrapper[4909]: I1201 12:00:00.335217 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/460ef7cb-e22d-4f17-aa4d-d6043dd00386-secret-volume\") pod \"collect-profiles-29409840-67xpl\" (UID: \"460ef7cb-e22d-4f17-aa4d-d6043dd00386\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-67xpl" Dec 01 12:00:00 crc kubenswrapper[4909]: I1201 12:00:00.336522 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/460ef7cb-e22d-4f17-aa4d-d6043dd00386-config-volume\") pod \"collect-profiles-29409840-67xpl\" (UID: \"460ef7cb-e22d-4f17-aa4d-d6043dd00386\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-67xpl" Dec 01 12:00:00 crc kubenswrapper[4909]: I1201 
12:00:00.350337 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/460ef7cb-e22d-4f17-aa4d-d6043dd00386-secret-volume\") pod \"collect-profiles-29409840-67xpl\" (UID: \"460ef7cb-e22d-4f17-aa4d-d6043dd00386\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-67xpl" Dec 01 12:00:00 crc kubenswrapper[4909]: I1201 12:00:00.352487 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8fzf\" (UniqueName: \"kubernetes.io/projected/460ef7cb-e22d-4f17-aa4d-d6043dd00386-kube-api-access-d8fzf\") pod \"collect-profiles-29409840-67xpl\" (UID: \"460ef7cb-e22d-4f17-aa4d-d6043dd00386\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-67xpl" Dec 01 12:00:00 crc kubenswrapper[4909]: I1201 12:00:00.478378 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-67xpl" Dec 01 12:00:00 crc kubenswrapper[4909]: I1201 12:00:00.900671 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409840-67xpl"] Dec 01 12:00:00 crc kubenswrapper[4909]: I1201 12:00:00.958619 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-67xpl" event={"ID":"460ef7cb-e22d-4f17-aa4d-d6043dd00386","Type":"ContainerStarted","Data":"ab7f924e07d457f163b61d6181f6e3c10995eb53fd67ba6d77f5e4a95e087228"} Dec 01 12:00:01 crc kubenswrapper[4909]: I1201 12:00:01.968270 4909 generic.go:334] "Generic (PLEG): container finished" podID="460ef7cb-e22d-4f17-aa4d-d6043dd00386" containerID="361addeddfd6296fa9f53e6795d101611d0712790b7b0166546cf52db230c410" exitCode=0 Dec 01 12:00:01 crc kubenswrapper[4909]: I1201 12:00:01.968378 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-67xpl" 
event={"ID":"460ef7cb-e22d-4f17-aa4d-d6043dd00386","Type":"ContainerDied","Data":"361addeddfd6296fa9f53e6795d101611d0712790b7b0166546cf52db230c410"} Dec 01 12:00:03 crc kubenswrapper[4909]: I1201 12:00:03.278722 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-67xpl" Dec 01 12:00:03 crc kubenswrapper[4909]: I1201 12:00:03.390428 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8fzf\" (UniqueName: \"kubernetes.io/projected/460ef7cb-e22d-4f17-aa4d-d6043dd00386-kube-api-access-d8fzf\") pod \"460ef7cb-e22d-4f17-aa4d-d6043dd00386\" (UID: \"460ef7cb-e22d-4f17-aa4d-d6043dd00386\") " Dec 01 12:00:03 crc kubenswrapper[4909]: I1201 12:00:03.390475 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/460ef7cb-e22d-4f17-aa4d-d6043dd00386-secret-volume\") pod \"460ef7cb-e22d-4f17-aa4d-d6043dd00386\" (UID: \"460ef7cb-e22d-4f17-aa4d-d6043dd00386\") " Dec 01 12:00:03 crc kubenswrapper[4909]: I1201 12:00:03.390547 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/460ef7cb-e22d-4f17-aa4d-d6043dd00386-config-volume\") pod \"460ef7cb-e22d-4f17-aa4d-d6043dd00386\" (UID: \"460ef7cb-e22d-4f17-aa4d-d6043dd00386\") " Dec 01 12:00:03 crc kubenswrapper[4909]: I1201 12:00:03.391809 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/460ef7cb-e22d-4f17-aa4d-d6043dd00386-config-volume" (OuterVolumeSpecName: "config-volume") pod "460ef7cb-e22d-4f17-aa4d-d6043dd00386" (UID: "460ef7cb-e22d-4f17-aa4d-d6043dd00386"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 12:00:03 crc kubenswrapper[4909]: I1201 12:00:03.397960 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/460ef7cb-e22d-4f17-aa4d-d6043dd00386-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "460ef7cb-e22d-4f17-aa4d-d6043dd00386" (UID: "460ef7cb-e22d-4f17-aa4d-d6043dd00386"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:00:03 crc kubenswrapper[4909]: I1201 12:00:03.399115 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/460ef7cb-e22d-4f17-aa4d-d6043dd00386-kube-api-access-d8fzf" (OuterVolumeSpecName: "kube-api-access-d8fzf") pod "460ef7cb-e22d-4f17-aa4d-d6043dd00386" (UID: "460ef7cb-e22d-4f17-aa4d-d6043dd00386"). InnerVolumeSpecName "kube-api-access-d8fzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:00:03 crc kubenswrapper[4909]: I1201 12:00:03.492998 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8fzf\" (UniqueName: \"kubernetes.io/projected/460ef7cb-e22d-4f17-aa4d-d6043dd00386-kube-api-access-d8fzf\") on node \"crc\" DevicePath \"\"" Dec 01 12:00:03 crc kubenswrapper[4909]: I1201 12:00:03.493044 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/460ef7cb-e22d-4f17-aa4d-d6043dd00386-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 12:00:03 crc kubenswrapper[4909]: I1201 12:00:03.493056 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/460ef7cb-e22d-4f17-aa4d-d6043dd00386-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 12:00:03 crc kubenswrapper[4909]: I1201 12:00:03.989497 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-67xpl" 
event={"ID":"460ef7cb-e22d-4f17-aa4d-d6043dd00386","Type":"ContainerDied","Data":"ab7f924e07d457f163b61d6181f6e3c10995eb53fd67ba6d77f5e4a95e087228"} Dec 01 12:00:03 crc kubenswrapper[4909]: I1201 12:00:03.989914 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab7f924e07d457f163b61d6181f6e3c10995eb53fd67ba6d77f5e4a95e087228" Dec 01 12:00:03 crc kubenswrapper[4909]: I1201 12:00:03.989884 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-67xpl" Dec 01 12:00:04 crc kubenswrapper[4909]: I1201 12:00:04.345744 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409795-4hggr"] Dec 01 12:00:04 crc kubenswrapper[4909]: I1201 12:00:04.353629 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409795-4hggr"] Dec 01 12:00:05 crc kubenswrapper[4909]: I1201 12:00:05.274392 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fea41a9e-ecee-46f7-90e5-dbba37333c85" path="/var/lib/kubelet/pods/fea41a9e-ecee-46f7-90e5-dbba37333c85/volumes" Dec 01 12:00:44 crc kubenswrapper[4909]: I1201 12:00:44.054694 4909 scope.go:117] "RemoveContainer" containerID="acc862312cc08563db7eb2aa7c20676aebf070cbea2600453d6de44a61ab369d" Dec 01 12:01:00 crc kubenswrapper[4909]: I1201 12:01:00.148533 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29409841-6bnlj"] Dec 01 12:01:00 crc kubenswrapper[4909]: E1201 12:01:00.149357 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="460ef7cb-e22d-4f17-aa4d-d6043dd00386" containerName="collect-profiles" Dec 01 12:01:00 crc kubenswrapper[4909]: I1201 12:01:00.149372 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="460ef7cb-e22d-4f17-aa4d-d6043dd00386" containerName="collect-profiles" Dec 01 12:01:00 crc 
kubenswrapper[4909]: I1201 12:01:00.149552 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="460ef7cb-e22d-4f17-aa4d-d6043dd00386" containerName="collect-profiles" Dec 01 12:01:00 crc kubenswrapper[4909]: I1201 12:01:00.150169 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29409841-6bnlj" Dec 01 12:01:00 crc kubenswrapper[4909]: I1201 12:01:00.161140 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29409841-6bnlj"] Dec 01 12:01:00 crc kubenswrapper[4909]: I1201 12:01:00.210526 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0c4f751-d7f4-48fc-815d-402b69708a55-config-data\") pod \"keystone-cron-29409841-6bnlj\" (UID: \"d0c4f751-d7f4-48fc-815d-402b69708a55\") " pod="openstack/keystone-cron-29409841-6bnlj" Dec 01 12:01:00 crc kubenswrapper[4909]: I1201 12:01:00.210807 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0c4f751-d7f4-48fc-815d-402b69708a55-combined-ca-bundle\") pod \"keystone-cron-29409841-6bnlj\" (UID: \"d0c4f751-d7f4-48fc-815d-402b69708a55\") " pod="openstack/keystone-cron-29409841-6bnlj" Dec 01 12:01:00 crc kubenswrapper[4909]: I1201 12:01:00.210961 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j258m\" (UniqueName: \"kubernetes.io/projected/d0c4f751-d7f4-48fc-815d-402b69708a55-kube-api-access-j258m\") pod \"keystone-cron-29409841-6bnlj\" (UID: \"d0c4f751-d7f4-48fc-815d-402b69708a55\") " pod="openstack/keystone-cron-29409841-6bnlj" Dec 01 12:01:00 crc kubenswrapper[4909]: I1201 12:01:00.211074 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/d0c4f751-d7f4-48fc-815d-402b69708a55-fernet-keys\") pod \"keystone-cron-29409841-6bnlj\" (UID: \"d0c4f751-d7f4-48fc-815d-402b69708a55\") " pod="openstack/keystone-cron-29409841-6bnlj" Dec 01 12:01:00 crc kubenswrapper[4909]: I1201 12:01:00.313068 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0c4f751-d7f4-48fc-815d-402b69708a55-config-data\") pod \"keystone-cron-29409841-6bnlj\" (UID: \"d0c4f751-d7f4-48fc-815d-402b69708a55\") " pod="openstack/keystone-cron-29409841-6bnlj" Dec 01 12:01:00 crc kubenswrapper[4909]: I1201 12:01:00.313139 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0c4f751-d7f4-48fc-815d-402b69708a55-combined-ca-bundle\") pod \"keystone-cron-29409841-6bnlj\" (UID: \"d0c4f751-d7f4-48fc-815d-402b69708a55\") " pod="openstack/keystone-cron-29409841-6bnlj" Dec 01 12:01:00 crc kubenswrapper[4909]: I1201 12:01:00.313207 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j258m\" (UniqueName: \"kubernetes.io/projected/d0c4f751-d7f4-48fc-815d-402b69708a55-kube-api-access-j258m\") pod \"keystone-cron-29409841-6bnlj\" (UID: \"d0c4f751-d7f4-48fc-815d-402b69708a55\") " pod="openstack/keystone-cron-29409841-6bnlj" Dec 01 12:01:00 crc kubenswrapper[4909]: I1201 12:01:00.313240 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d0c4f751-d7f4-48fc-815d-402b69708a55-fernet-keys\") pod \"keystone-cron-29409841-6bnlj\" (UID: \"d0c4f751-d7f4-48fc-815d-402b69708a55\") " pod="openstack/keystone-cron-29409841-6bnlj" Dec 01 12:01:00 crc kubenswrapper[4909]: I1201 12:01:00.319168 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/d0c4f751-d7f4-48fc-815d-402b69708a55-fernet-keys\") pod \"keystone-cron-29409841-6bnlj\" (UID: \"d0c4f751-d7f4-48fc-815d-402b69708a55\") " pod="openstack/keystone-cron-29409841-6bnlj" Dec 01 12:01:00 crc kubenswrapper[4909]: I1201 12:01:00.319178 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0c4f751-d7f4-48fc-815d-402b69708a55-config-data\") pod \"keystone-cron-29409841-6bnlj\" (UID: \"d0c4f751-d7f4-48fc-815d-402b69708a55\") " pod="openstack/keystone-cron-29409841-6bnlj" Dec 01 12:01:00 crc kubenswrapper[4909]: I1201 12:01:00.319754 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0c4f751-d7f4-48fc-815d-402b69708a55-combined-ca-bundle\") pod \"keystone-cron-29409841-6bnlj\" (UID: \"d0c4f751-d7f4-48fc-815d-402b69708a55\") " pod="openstack/keystone-cron-29409841-6bnlj" Dec 01 12:01:00 crc kubenswrapper[4909]: I1201 12:01:00.333128 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j258m\" (UniqueName: \"kubernetes.io/projected/d0c4f751-d7f4-48fc-815d-402b69708a55-kube-api-access-j258m\") pod \"keystone-cron-29409841-6bnlj\" (UID: \"d0c4f751-d7f4-48fc-815d-402b69708a55\") " pod="openstack/keystone-cron-29409841-6bnlj" Dec 01 12:01:00 crc kubenswrapper[4909]: I1201 12:01:00.471329 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29409841-6bnlj" Dec 01 12:01:00 crc kubenswrapper[4909]: I1201 12:01:00.914297 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29409841-6bnlj"] Dec 01 12:01:01 crc kubenswrapper[4909]: I1201 12:01:01.471572 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409841-6bnlj" event={"ID":"d0c4f751-d7f4-48fc-815d-402b69708a55","Type":"ContainerStarted","Data":"1e3f6fc156c5f8b401ab0b1e18795187137a29eeb8c9670b92dca451a36e3fac"} Dec 01 12:01:01 crc kubenswrapper[4909]: I1201 12:01:01.471856 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409841-6bnlj" event={"ID":"d0c4f751-d7f4-48fc-815d-402b69708a55","Type":"ContainerStarted","Data":"a263c95fe7c4ca0cb4b054e51201fff189761f8c39600183e564135ede368db9"} Dec 01 12:01:01 crc kubenswrapper[4909]: I1201 12:01:01.490583 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29409841-6bnlj" podStartSLOduration=1.49056686 podStartE2EDuration="1.49056686s" podCreationTimestamp="2025-12-01 12:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 12:01:01.484908723 +0000 UTC m=+5378.719379631" watchObservedRunningTime="2025-12-01 12:01:01.49056686 +0000 UTC m=+5378.725037758" Dec 01 12:01:03 crc kubenswrapper[4909]: I1201 12:01:03.508563 4909 generic.go:334] "Generic (PLEG): container finished" podID="d0c4f751-d7f4-48fc-815d-402b69708a55" containerID="1e3f6fc156c5f8b401ab0b1e18795187137a29eeb8c9670b92dca451a36e3fac" exitCode=0 Dec 01 12:01:03 crc kubenswrapper[4909]: I1201 12:01:03.508647 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409841-6bnlj" event={"ID":"d0c4f751-d7f4-48fc-815d-402b69708a55","Type":"ContainerDied","Data":"1e3f6fc156c5f8b401ab0b1e18795187137a29eeb8c9670b92dca451a36e3fac"} 
Dec 01 12:01:04 crc kubenswrapper[4909]: I1201 12:01:04.837730 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29409841-6bnlj" Dec 01 12:01:04 crc kubenswrapper[4909]: I1201 12:01:04.905684 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0c4f751-d7f4-48fc-815d-402b69708a55-config-data\") pod \"d0c4f751-d7f4-48fc-815d-402b69708a55\" (UID: \"d0c4f751-d7f4-48fc-815d-402b69708a55\") " Dec 01 12:01:04 crc kubenswrapper[4909]: I1201 12:01:04.905754 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0c4f751-d7f4-48fc-815d-402b69708a55-combined-ca-bundle\") pod \"d0c4f751-d7f4-48fc-815d-402b69708a55\" (UID: \"d0c4f751-d7f4-48fc-815d-402b69708a55\") " Dec 01 12:01:04 crc kubenswrapper[4909]: I1201 12:01:04.905841 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d0c4f751-d7f4-48fc-815d-402b69708a55-fernet-keys\") pod \"d0c4f751-d7f4-48fc-815d-402b69708a55\" (UID: \"d0c4f751-d7f4-48fc-815d-402b69708a55\") " Dec 01 12:01:04 crc kubenswrapper[4909]: I1201 12:01:04.905927 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j258m\" (UniqueName: \"kubernetes.io/projected/d0c4f751-d7f4-48fc-815d-402b69708a55-kube-api-access-j258m\") pod \"d0c4f751-d7f4-48fc-815d-402b69708a55\" (UID: \"d0c4f751-d7f4-48fc-815d-402b69708a55\") " Dec 01 12:01:04 crc kubenswrapper[4909]: I1201 12:01:04.911280 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0c4f751-d7f4-48fc-815d-402b69708a55-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d0c4f751-d7f4-48fc-815d-402b69708a55" (UID: "d0c4f751-d7f4-48fc-815d-402b69708a55"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:01:04 crc kubenswrapper[4909]: I1201 12:01:04.911468 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0c4f751-d7f4-48fc-815d-402b69708a55-kube-api-access-j258m" (OuterVolumeSpecName: "kube-api-access-j258m") pod "d0c4f751-d7f4-48fc-815d-402b69708a55" (UID: "d0c4f751-d7f4-48fc-815d-402b69708a55"). InnerVolumeSpecName "kube-api-access-j258m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:01:04 crc kubenswrapper[4909]: I1201 12:01:04.936824 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0c4f751-d7f4-48fc-815d-402b69708a55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0c4f751-d7f4-48fc-815d-402b69708a55" (UID: "d0c4f751-d7f4-48fc-815d-402b69708a55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:01:04 crc kubenswrapper[4909]: I1201 12:01:04.954036 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0c4f751-d7f4-48fc-815d-402b69708a55-config-data" (OuterVolumeSpecName: "config-data") pod "d0c4f751-d7f4-48fc-815d-402b69708a55" (UID: "d0c4f751-d7f4-48fc-815d-402b69708a55"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:01:05 crc kubenswrapper[4909]: I1201 12:01:05.008157 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0c4f751-d7f4-48fc-815d-402b69708a55-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 12:01:05 crc kubenswrapper[4909]: I1201 12:01:05.008196 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0c4f751-d7f4-48fc-815d-402b69708a55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:01:05 crc kubenswrapper[4909]: I1201 12:01:05.008207 4909 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d0c4f751-d7f4-48fc-815d-402b69708a55-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 12:01:05 crc kubenswrapper[4909]: I1201 12:01:05.008217 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j258m\" (UniqueName: \"kubernetes.io/projected/d0c4f751-d7f4-48fc-815d-402b69708a55-kube-api-access-j258m\") on node \"crc\" DevicePath \"\"" Dec 01 12:01:05 crc kubenswrapper[4909]: I1201 12:01:05.528087 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409841-6bnlj" event={"ID":"d0c4f751-d7f4-48fc-815d-402b69708a55","Type":"ContainerDied","Data":"a263c95fe7c4ca0cb4b054e51201fff189761f8c39600183e564135ede368db9"} Dec 01 12:01:05 crc kubenswrapper[4909]: I1201 12:01:05.528382 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a263c95fe7c4ca0cb4b054e51201fff189761f8c39600183e564135ede368db9" Dec 01 12:01:05 crc kubenswrapper[4909]: I1201 12:01:05.528117 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29409841-6bnlj" Dec 01 12:01:36 crc kubenswrapper[4909]: I1201 12:01:36.194245 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:01:36 crc kubenswrapper[4909]: I1201 12:01:36.194995 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:02:06 crc kubenswrapper[4909]: I1201 12:02:06.193700 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:02:06 crc kubenswrapper[4909]: I1201 12:02:06.194268 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:02:12 crc kubenswrapper[4909]: I1201 12:02:12.271453 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z9j9k"] Dec 01 12:02:12 crc kubenswrapper[4909]: E1201 12:02:12.272344 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0c4f751-d7f4-48fc-815d-402b69708a55" containerName="keystone-cron" Dec 01 12:02:12 crc kubenswrapper[4909]: I1201 12:02:12.272358 
4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0c4f751-d7f4-48fc-815d-402b69708a55" containerName="keystone-cron" Dec 01 12:02:12 crc kubenswrapper[4909]: I1201 12:02:12.272567 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0c4f751-d7f4-48fc-815d-402b69708a55" containerName="keystone-cron" Dec 01 12:02:12 crc kubenswrapper[4909]: I1201 12:02:12.274110 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z9j9k" Dec 01 12:02:12 crc kubenswrapper[4909]: I1201 12:02:12.287097 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z9j9k"] Dec 01 12:02:12 crc kubenswrapper[4909]: I1201 12:02:12.438657 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2678a022-9d18-4843-9a3a-4ef00f3c4451-utilities\") pod \"redhat-marketplace-z9j9k\" (UID: \"2678a022-9d18-4843-9a3a-4ef00f3c4451\") " pod="openshift-marketplace/redhat-marketplace-z9j9k" Dec 01 12:02:12 crc kubenswrapper[4909]: I1201 12:02:12.438712 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcr77\" (UniqueName: \"kubernetes.io/projected/2678a022-9d18-4843-9a3a-4ef00f3c4451-kube-api-access-vcr77\") pod \"redhat-marketplace-z9j9k\" (UID: \"2678a022-9d18-4843-9a3a-4ef00f3c4451\") " pod="openshift-marketplace/redhat-marketplace-z9j9k" Dec 01 12:02:12 crc kubenswrapper[4909]: I1201 12:02:12.438769 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2678a022-9d18-4843-9a3a-4ef00f3c4451-catalog-content\") pod \"redhat-marketplace-z9j9k\" (UID: \"2678a022-9d18-4843-9a3a-4ef00f3c4451\") " pod="openshift-marketplace/redhat-marketplace-z9j9k" Dec 01 12:02:12 crc kubenswrapper[4909]: I1201 12:02:12.540518 
4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcr77\" (UniqueName: \"kubernetes.io/projected/2678a022-9d18-4843-9a3a-4ef00f3c4451-kube-api-access-vcr77\") pod \"redhat-marketplace-z9j9k\" (UID: \"2678a022-9d18-4843-9a3a-4ef00f3c4451\") " pod="openshift-marketplace/redhat-marketplace-z9j9k" Dec 01 12:02:12 crc kubenswrapper[4909]: I1201 12:02:12.540568 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2678a022-9d18-4843-9a3a-4ef00f3c4451-utilities\") pod \"redhat-marketplace-z9j9k\" (UID: \"2678a022-9d18-4843-9a3a-4ef00f3c4451\") " pod="openshift-marketplace/redhat-marketplace-z9j9k" Dec 01 12:02:12 crc kubenswrapper[4909]: I1201 12:02:12.540611 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2678a022-9d18-4843-9a3a-4ef00f3c4451-catalog-content\") pod \"redhat-marketplace-z9j9k\" (UID: \"2678a022-9d18-4843-9a3a-4ef00f3c4451\") " pod="openshift-marketplace/redhat-marketplace-z9j9k" Dec 01 12:02:12 crc kubenswrapper[4909]: I1201 12:02:12.541132 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2678a022-9d18-4843-9a3a-4ef00f3c4451-catalog-content\") pod \"redhat-marketplace-z9j9k\" (UID: \"2678a022-9d18-4843-9a3a-4ef00f3c4451\") " pod="openshift-marketplace/redhat-marketplace-z9j9k" Dec 01 12:02:12 crc kubenswrapper[4909]: I1201 12:02:12.541292 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2678a022-9d18-4843-9a3a-4ef00f3c4451-utilities\") pod \"redhat-marketplace-z9j9k\" (UID: \"2678a022-9d18-4843-9a3a-4ef00f3c4451\") " pod="openshift-marketplace/redhat-marketplace-z9j9k" Dec 01 12:02:12 crc kubenswrapper[4909]: I1201 12:02:12.565252 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vcr77\" (UniqueName: \"kubernetes.io/projected/2678a022-9d18-4843-9a3a-4ef00f3c4451-kube-api-access-vcr77\") pod \"redhat-marketplace-z9j9k\" (UID: \"2678a022-9d18-4843-9a3a-4ef00f3c4451\") " pod="openshift-marketplace/redhat-marketplace-z9j9k" Dec 01 12:02:12 crc kubenswrapper[4909]: I1201 12:02:12.638354 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z9j9k" Dec 01 12:02:13 crc kubenswrapper[4909]: I1201 12:02:13.074549 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z9j9k"] Dec 01 12:02:13 crc kubenswrapper[4909]: W1201 12:02:13.077552 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2678a022_9d18_4843_9a3a_4ef00f3c4451.slice/crio-1fbf7b299863fa20ca61d657fa2e68e074eac24f53352bb7e2c6a8ef505d442a WatchSource:0}: Error finding container 1fbf7b299863fa20ca61d657fa2e68e074eac24f53352bb7e2c6a8ef505d442a: Status 404 returned error can't find the container with id 1fbf7b299863fa20ca61d657fa2e68e074eac24f53352bb7e2c6a8ef505d442a Dec 01 12:02:13 crc kubenswrapper[4909]: I1201 12:02:13.104002 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z9j9k" event={"ID":"2678a022-9d18-4843-9a3a-4ef00f3c4451","Type":"ContainerStarted","Data":"1fbf7b299863fa20ca61d657fa2e68e074eac24f53352bb7e2c6a8ef505d442a"} Dec 01 12:02:14 crc kubenswrapper[4909]: I1201 12:02:14.113434 4909 generic.go:334] "Generic (PLEG): container finished" podID="2678a022-9d18-4843-9a3a-4ef00f3c4451" containerID="b61f0c43f2b4256c909576d680f9cb750d3effc26e87e7c83962c76719f9ca7f" exitCode=0 Dec 01 12:02:14 crc kubenswrapper[4909]: I1201 12:02:14.113494 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z9j9k" 
event={"ID":"2678a022-9d18-4843-9a3a-4ef00f3c4451","Type":"ContainerDied","Data":"b61f0c43f2b4256c909576d680f9cb750d3effc26e87e7c83962c76719f9ca7f"} Dec 01 12:02:16 crc kubenswrapper[4909]: I1201 12:02:16.130516 4909 generic.go:334] "Generic (PLEG): container finished" podID="2678a022-9d18-4843-9a3a-4ef00f3c4451" containerID="90e3544b88e1c9f07ea5f73ccccfb7741d5c67bdcd054bd61f7f4fffd39f8290" exitCode=0 Dec 01 12:02:16 crc kubenswrapper[4909]: I1201 12:02:16.130590 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z9j9k" event={"ID":"2678a022-9d18-4843-9a3a-4ef00f3c4451","Type":"ContainerDied","Data":"90e3544b88e1c9f07ea5f73ccccfb7741d5c67bdcd054bd61f7f4fffd39f8290"} Dec 01 12:02:18 crc kubenswrapper[4909]: I1201 12:02:18.164161 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z9j9k" event={"ID":"2678a022-9d18-4843-9a3a-4ef00f3c4451","Type":"ContainerStarted","Data":"ba9171215bfed2842a42f660dcd4952c516abf4d58e96ba07feb5357f5d65660"} Dec 01 12:02:18 crc kubenswrapper[4909]: I1201 12:02:18.190361 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z9j9k" podStartSLOduration=3.047547329 podStartE2EDuration="6.190326351s" podCreationTimestamp="2025-12-01 12:02:12 +0000 UTC" firstStartedPulling="2025-12-01 12:02:14.115477426 +0000 UTC m=+5451.349948324" lastFinishedPulling="2025-12-01 12:02:17.258256448 +0000 UTC m=+5454.492727346" observedRunningTime="2025-12-01 12:02:18.185910052 +0000 UTC m=+5455.420380960" watchObservedRunningTime="2025-12-01 12:02:18.190326351 +0000 UTC m=+5455.424797259" Dec 01 12:02:21 crc kubenswrapper[4909]: I1201 12:02:21.707361 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hpn68"] Dec 01 12:02:21 crc kubenswrapper[4909]: I1201 12:02:21.711234 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hpn68" Dec 01 12:02:21 crc kubenswrapper[4909]: I1201 12:02:21.744825 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hpn68"] Dec 01 12:02:21 crc kubenswrapper[4909]: I1201 12:02:21.826586 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/769f194a-a720-4207-ac28-e63813534aa4-catalog-content\") pod \"certified-operators-hpn68\" (UID: \"769f194a-a720-4207-ac28-e63813534aa4\") " pod="openshift-marketplace/certified-operators-hpn68" Dec 01 12:02:21 crc kubenswrapper[4909]: I1201 12:02:21.826637 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ggts\" (UniqueName: \"kubernetes.io/projected/769f194a-a720-4207-ac28-e63813534aa4-kube-api-access-5ggts\") pod \"certified-operators-hpn68\" (UID: \"769f194a-a720-4207-ac28-e63813534aa4\") " pod="openshift-marketplace/certified-operators-hpn68" Dec 01 12:02:21 crc kubenswrapper[4909]: I1201 12:02:21.826690 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/769f194a-a720-4207-ac28-e63813534aa4-utilities\") pod \"certified-operators-hpn68\" (UID: \"769f194a-a720-4207-ac28-e63813534aa4\") " pod="openshift-marketplace/certified-operators-hpn68" Dec 01 12:02:21 crc kubenswrapper[4909]: I1201 12:02:21.928024 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/769f194a-a720-4207-ac28-e63813534aa4-catalog-content\") pod \"certified-operators-hpn68\" (UID: \"769f194a-a720-4207-ac28-e63813534aa4\") " pod="openshift-marketplace/certified-operators-hpn68" Dec 01 12:02:21 crc kubenswrapper[4909]: I1201 12:02:21.928072 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5ggts\" (UniqueName: \"kubernetes.io/projected/769f194a-a720-4207-ac28-e63813534aa4-kube-api-access-5ggts\") pod \"certified-operators-hpn68\" (UID: \"769f194a-a720-4207-ac28-e63813534aa4\") " pod="openshift-marketplace/certified-operators-hpn68" Dec 01 12:02:21 crc kubenswrapper[4909]: I1201 12:02:21.928123 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/769f194a-a720-4207-ac28-e63813534aa4-utilities\") pod \"certified-operators-hpn68\" (UID: \"769f194a-a720-4207-ac28-e63813534aa4\") " pod="openshift-marketplace/certified-operators-hpn68" Dec 01 12:02:21 crc kubenswrapper[4909]: I1201 12:02:21.928798 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/769f194a-a720-4207-ac28-e63813534aa4-utilities\") pod \"certified-operators-hpn68\" (UID: \"769f194a-a720-4207-ac28-e63813534aa4\") " pod="openshift-marketplace/certified-operators-hpn68" Dec 01 12:02:21 crc kubenswrapper[4909]: I1201 12:02:21.928802 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/769f194a-a720-4207-ac28-e63813534aa4-catalog-content\") pod \"certified-operators-hpn68\" (UID: \"769f194a-a720-4207-ac28-e63813534aa4\") " pod="openshift-marketplace/certified-operators-hpn68" Dec 01 12:02:21 crc kubenswrapper[4909]: I1201 12:02:21.957676 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ggts\" (UniqueName: \"kubernetes.io/projected/769f194a-a720-4207-ac28-e63813534aa4-kube-api-access-5ggts\") pod \"certified-operators-hpn68\" (UID: \"769f194a-a720-4207-ac28-e63813534aa4\") " pod="openshift-marketplace/certified-operators-hpn68" Dec 01 12:02:22 crc kubenswrapper[4909]: I1201 12:02:22.045478 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hpn68" Dec 01 12:02:22 crc kubenswrapper[4909]: I1201 12:02:22.561850 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hpn68"] Dec 01 12:02:22 crc kubenswrapper[4909]: I1201 12:02:22.638687 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z9j9k" Dec 01 12:02:22 crc kubenswrapper[4909]: I1201 12:02:22.638734 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z9j9k" Dec 01 12:02:22 crc kubenswrapper[4909]: I1201 12:02:22.688147 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z9j9k" Dec 01 12:02:23 crc kubenswrapper[4909]: I1201 12:02:23.221668 4909 generic.go:334] "Generic (PLEG): container finished" podID="769f194a-a720-4207-ac28-e63813534aa4" containerID="6bb6d47d0b495f7949033386595ed928357dc946836880806911134e0ea904bc" exitCode=0 Dec 01 12:02:23 crc kubenswrapper[4909]: I1201 12:02:23.221740 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpn68" event={"ID":"769f194a-a720-4207-ac28-e63813534aa4","Type":"ContainerDied","Data":"6bb6d47d0b495f7949033386595ed928357dc946836880806911134e0ea904bc"} Dec 01 12:02:23 crc kubenswrapper[4909]: I1201 12:02:23.223192 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpn68" event={"ID":"769f194a-a720-4207-ac28-e63813534aa4","Type":"ContainerStarted","Data":"93ebd7590af9dc6e68e11e98e56e940c49ad73c85608a4096c8816d2e582dbaa"} Dec 01 12:02:23 crc kubenswrapper[4909]: I1201 12:02:23.309327 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z9j9k" Dec 01 12:02:25 crc kubenswrapper[4909]: I1201 12:02:25.080271 4909 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z9j9k"] Dec 01 12:02:25 crc kubenswrapper[4909]: I1201 12:02:25.251935 4909 generic.go:334] "Generic (PLEG): container finished" podID="769f194a-a720-4207-ac28-e63813534aa4" containerID="506629c97c1ed42930d804629e3b45fd173e290528d1041eec0edc034fc93002" exitCode=0 Dec 01 12:02:25 crc kubenswrapper[4909]: I1201 12:02:25.252034 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpn68" event={"ID":"769f194a-a720-4207-ac28-e63813534aa4","Type":"ContainerDied","Data":"506629c97c1ed42930d804629e3b45fd173e290528d1041eec0edc034fc93002"} Dec 01 12:02:25 crc kubenswrapper[4909]: I1201 12:02:25.252270 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z9j9k" podUID="2678a022-9d18-4843-9a3a-4ef00f3c4451" containerName="registry-server" containerID="cri-o://ba9171215bfed2842a42f660dcd4952c516abf4d58e96ba07feb5357f5d65660" gracePeriod=2 Dec 01 12:02:25 crc kubenswrapper[4909]: I1201 12:02:25.715145 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z9j9k" Dec 01 12:02:25 crc kubenswrapper[4909]: I1201 12:02:25.843103 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2678a022-9d18-4843-9a3a-4ef00f3c4451-catalog-content\") pod \"2678a022-9d18-4843-9a3a-4ef00f3c4451\" (UID: \"2678a022-9d18-4843-9a3a-4ef00f3c4451\") " Dec 01 12:02:25 crc kubenswrapper[4909]: I1201 12:02:25.843563 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcr77\" (UniqueName: \"kubernetes.io/projected/2678a022-9d18-4843-9a3a-4ef00f3c4451-kube-api-access-vcr77\") pod \"2678a022-9d18-4843-9a3a-4ef00f3c4451\" (UID: \"2678a022-9d18-4843-9a3a-4ef00f3c4451\") " Dec 01 12:02:25 crc kubenswrapper[4909]: I1201 12:02:25.843713 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2678a022-9d18-4843-9a3a-4ef00f3c4451-utilities\") pod \"2678a022-9d18-4843-9a3a-4ef00f3c4451\" (UID: \"2678a022-9d18-4843-9a3a-4ef00f3c4451\") " Dec 01 12:02:25 crc kubenswrapper[4909]: I1201 12:02:25.844893 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2678a022-9d18-4843-9a3a-4ef00f3c4451-utilities" (OuterVolumeSpecName: "utilities") pod "2678a022-9d18-4843-9a3a-4ef00f3c4451" (UID: "2678a022-9d18-4843-9a3a-4ef00f3c4451"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:02:25 crc kubenswrapper[4909]: I1201 12:02:25.853298 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2678a022-9d18-4843-9a3a-4ef00f3c4451-kube-api-access-vcr77" (OuterVolumeSpecName: "kube-api-access-vcr77") pod "2678a022-9d18-4843-9a3a-4ef00f3c4451" (UID: "2678a022-9d18-4843-9a3a-4ef00f3c4451"). InnerVolumeSpecName "kube-api-access-vcr77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:02:25 crc kubenswrapper[4909]: I1201 12:02:25.865750 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2678a022-9d18-4843-9a3a-4ef00f3c4451-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2678a022-9d18-4843-9a3a-4ef00f3c4451" (UID: "2678a022-9d18-4843-9a3a-4ef00f3c4451"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:02:25 crc kubenswrapper[4909]: I1201 12:02:25.946034 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcr77\" (UniqueName: \"kubernetes.io/projected/2678a022-9d18-4843-9a3a-4ef00f3c4451-kube-api-access-vcr77\") on node \"crc\" DevicePath \"\"" Dec 01 12:02:25 crc kubenswrapper[4909]: I1201 12:02:25.946069 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2678a022-9d18-4843-9a3a-4ef00f3c4451-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:02:25 crc kubenswrapper[4909]: I1201 12:02:25.946078 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2678a022-9d18-4843-9a3a-4ef00f3c4451-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:02:26 crc kubenswrapper[4909]: I1201 12:02:26.266642 4909 generic.go:334] "Generic (PLEG): container finished" podID="2678a022-9d18-4843-9a3a-4ef00f3c4451" containerID="ba9171215bfed2842a42f660dcd4952c516abf4d58e96ba07feb5357f5d65660" exitCode=0 Dec 01 12:02:26 crc kubenswrapper[4909]: I1201 12:02:26.266730 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z9j9k" event={"ID":"2678a022-9d18-4843-9a3a-4ef00f3c4451","Type":"ContainerDied","Data":"ba9171215bfed2842a42f660dcd4952c516abf4d58e96ba07feb5357f5d65660"} Dec 01 12:02:26 crc kubenswrapper[4909]: I1201 12:02:26.266766 4909 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-z9j9k" event={"ID":"2678a022-9d18-4843-9a3a-4ef00f3c4451","Type":"ContainerDied","Data":"1fbf7b299863fa20ca61d657fa2e68e074eac24f53352bb7e2c6a8ef505d442a"} Dec 01 12:02:26 crc kubenswrapper[4909]: I1201 12:02:26.266789 4909 scope.go:117] "RemoveContainer" containerID="ba9171215bfed2842a42f660dcd4952c516abf4d58e96ba07feb5357f5d65660" Dec 01 12:02:26 crc kubenswrapper[4909]: I1201 12:02:26.266840 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z9j9k" Dec 01 12:02:26 crc kubenswrapper[4909]: I1201 12:02:26.272369 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpn68" event={"ID":"769f194a-a720-4207-ac28-e63813534aa4","Type":"ContainerStarted","Data":"b212b75c58e50222f5239b3708b74dc598a2f6efbfe489945b327c367f302dbf"} Dec 01 12:02:26 crc kubenswrapper[4909]: I1201 12:02:26.290448 4909 scope.go:117] "RemoveContainer" containerID="90e3544b88e1c9f07ea5f73ccccfb7741d5c67bdcd054bd61f7f4fffd39f8290" Dec 01 12:02:26 crc kubenswrapper[4909]: I1201 12:02:26.309040 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hpn68" podStartSLOduration=2.836198629 podStartE2EDuration="5.309000748s" podCreationTimestamp="2025-12-01 12:02:21 +0000 UTC" firstStartedPulling="2025-12-01 12:02:23.226038579 +0000 UTC m=+5460.460509477" lastFinishedPulling="2025-12-01 12:02:25.698840708 +0000 UTC m=+5462.933311596" observedRunningTime="2025-12-01 12:02:26.296145514 +0000 UTC m=+5463.530616442" watchObservedRunningTime="2025-12-01 12:02:26.309000748 +0000 UTC m=+5463.543471666" Dec 01 12:02:26 crc kubenswrapper[4909]: I1201 12:02:26.343385 4909 scope.go:117] "RemoveContainer" containerID="b61f0c43f2b4256c909576d680f9cb750d3effc26e87e7c83962c76719f9ca7f" Dec 01 12:02:26 crc kubenswrapper[4909]: I1201 12:02:26.346630 4909 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z9j9k"] Dec 01 12:02:26 crc kubenswrapper[4909]: I1201 12:02:26.359075 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z9j9k"] Dec 01 12:02:26 crc kubenswrapper[4909]: I1201 12:02:26.371511 4909 scope.go:117] "RemoveContainer" containerID="ba9171215bfed2842a42f660dcd4952c516abf4d58e96ba07feb5357f5d65660" Dec 01 12:02:26 crc kubenswrapper[4909]: E1201 12:02:26.372044 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba9171215bfed2842a42f660dcd4952c516abf4d58e96ba07feb5357f5d65660\": container with ID starting with ba9171215bfed2842a42f660dcd4952c516abf4d58e96ba07feb5357f5d65660 not found: ID does not exist" containerID="ba9171215bfed2842a42f660dcd4952c516abf4d58e96ba07feb5357f5d65660" Dec 01 12:02:26 crc kubenswrapper[4909]: I1201 12:02:26.372088 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba9171215bfed2842a42f660dcd4952c516abf4d58e96ba07feb5357f5d65660"} err="failed to get container status \"ba9171215bfed2842a42f660dcd4952c516abf4d58e96ba07feb5357f5d65660\": rpc error: code = NotFound desc = could not find container \"ba9171215bfed2842a42f660dcd4952c516abf4d58e96ba07feb5357f5d65660\": container with ID starting with ba9171215bfed2842a42f660dcd4952c516abf4d58e96ba07feb5357f5d65660 not found: ID does not exist" Dec 01 12:02:26 crc kubenswrapper[4909]: I1201 12:02:26.372124 4909 scope.go:117] "RemoveContainer" containerID="90e3544b88e1c9f07ea5f73ccccfb7741d5c67bdcd054bd61f7f4fffd39f8290" Dec 01 12:02:26 crc kubenswrapper[4909]: E1201 12:02:26.372349 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90e3544b88e1c9f07ea5f73ccccfb7741d5c67bdcd054bd61f7f4fffd39f8290\": container with ID starting with 
90e3544b88e1c9f07ea5f73ccccfb7741d5c67bdcd054bd61f7f4fffd39f8290 not found: ID does not exist" containerID="90e3544b88e1c9f07ea5f73ccccfb7741d5c67bdcd054bd61f7f4fffd39f8290" Dec 01 12:02:26 crc kubenswrapper[4909]: I1201 12:02:26.372386 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90e3544b88e1c9f07ea5f73ccccfb7741d5c67bdcd054bd61f7f4fffd39f8290"} err="failed to get container status \"90e3544b88e1c9f07ea5f73ccccfb7741d5c67bdcd054bd61f7f4fffd39f8290\": rpc error: code = NotFound desc = could not find container \"90e3544b88e1c9f07ea5f73ccccfb7741d5c67bdcd054bd61f7f4fffd39f8290\": container with ID starting with 90e3544b88e1c9f07ea5f73ccccfb7741d5c67bdcd054bd61f7f4fffd39f8290 not found: ID does not exist" Dec 01 12:02:26 crc kubenswrapper[4909]: I1201 12:02:26.372406 4909 scope.go:117] "RemoveContainer" containerID="b61f0c43f2b4256c909576d680f9cb750d3effc26e87e7c83962c76719f9ca7f" Dec 01 12:02:26 crc kubenswrapper[4909]: E1201 12:02:26.372619 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b61f0c43f2b4256c909576d680f9cb750d3effc26e87e7c83962c76719f9ca7f\": container with ID starting with b61f0c43f2b4256c909576d680f9cb750d3effc26e87e7c83962c76719f9ca7f not found: ID does not exist" containerID="b61f0c43f2b4256c909576d680f9cb750d3effc26e87e7c83962c76719f9ca7f" Dec 01 12:02:26 crc kubenswrapper[4909]: I1201 12:02:26.372652 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b61f0c43f2b4256c909576d680f9cb750d3effc26e87e7c83962c76719f9ca7f"} err="failed to get container status \"b61f0c43f2b4256c909576d680f9cb750d3effc26e87e7c83962c76719f9ca7f\": rpc error: code = NotFound desc = could not find container \"b61f0c43f2b4256c909576d680f9cb750d3effc26e87e7c83962c76719f9ca7f\": container with ID starting with b61f0c43f2b4256c909576d680f9cb750d3effc26e87e7c83962c76719f9ca7f not found: ID does not 
exist" Dec 01 12:02:27 crc kubenswrapper[4909]: I1201 12:02:27.280230 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2678a022-9d18-4843-9a3a-4ef00f3c4451" path="/var/lib/kubelet/pods/2678a022-9d18-4843-9a3a-4ef00f3c4451/volumes" Dec 01 12:02:29 crc kubenswrapper[4909]: I1201 12:02:29.308202 4909 generic.go:334] "Generic (PLEG): container finished" podID="50803ad1-ed73-4df3-bddc-6d8fd12aa087" containerID="33532eff699c87e2f8dd804f4a287112f8972e1f2ed33d6038036e3c163cd387" exitCode=2 Dec 01 12:02:29 crc kubenswrapper[4909]: I1201 12:02:29.308317 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" event={"ID":"50803ad1-ed73-4df3-bddc-6d8fd12aa087","Type":"ContainerDied","Data":"33532eff699c87e2f8dd804f4a287112f8972e1f2ed33d6038036e3c163cd387"} Dec 01 12:02:30 crc kubenswrapper[4909]: I1201 12:02:30.999462 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 12:02:31 crc kubenswrapper[4909]: I1201 12:02:31.174940 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-ceph\") pod \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " Dec 01 12:02:31 crc kubenswrapper[4909]: I1201 12:02:31.175407 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-inventory\") pod \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " Dec 01 12:02:31 crc kubenswrapper[4909]: I1201 12:02:31.175524 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmjqk\" (UniqueName: 
\"kubernetes.io/projected/50803ad1-ed73-4df3-bddc-6d8fd12aa087-kube-api-access-kmjqk\") pod \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " Dec 01 12:02:31 crc kubenswrapper[4909]: I1201 12:02:31.175549 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-libvirt-combined-ca-bundle\") pod \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " Dec 01 12:02:31 crc kubenswrapper[4909]: I1201 12:02:31.175578 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-libvirt-secret-0\") pod \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " Dec 01 12:02:31 crc kubenswrapper[4909]: I1201 12:02:31.175617 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-ssh-key\") pod \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\" (UID: \"50803ad1-ed73-4df3-bddc-6d8fd12aa087\") " Dec 01 12:02:31 crc kubenswrapper[4909]: I1201 12:02:31.183285 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50803ad1-ed73-4df3-bddc-6d8fd12aa087-kube-api-access-kmjqk" (OuterVolumeSpecName: "kube-api-access-kmjqk") pod "50803ad1-ed73-4df3-bddc-6d8fd12aa087" (UID: "50803ad1-ed73-4df3-bddc-6d8fd12aa087"). InnerVolumeSpecName "kube-api-access-kmjqk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:02:31 crc kubenswrapper[4909]: I1201 12:02:31.183817 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-ceph" (OuterVolumeSpecName: "ceph") pod "50803ad1-ed73-4df3-bddc-6d8fd12aa087" (UID: "50803ad1-ed73-4df3-bddc-6d8fd12aa087"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:02:31 crc kubenswrapper[4909]: I1201 12:02:31.190103 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "50803ad1-ed73-4df3-bddc-6d8fd12aa087" (UID: "50803ad1-ed73-4df3-bddc-6d8fd12aa087"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:02:31 crc kubenswrapper[4909]: I1201 12:02:31.211569 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "50803ad1-ed73-4df3-bddc-6d8fd12aa087" (UID: "50803ad1-ed73-4df3-bddc-6d8fd12aa087"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:02:31 crc kubenswrapper[4909]: I1201 12:02:31.212093 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-inventory" (OuterVolumeSpecName: "inventory") pod "50803ad1-ed73-4df3-bddc-6d8fd12aa087" (UID: "50803ad1-ed73-4df3-bddc-6d8fd12aa087"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:02:31 crc kubenswrapper[4909]: I1201 12:02:31.214440 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "50803ad1-ed73-4df3-bddc-6d8fd12aa087" (UID: "50803ad1-ed73-4df3-bddc-6d8fd12aa087"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:02:31 crc kubenswrapper[4909]: I1201 12:02:31.278006 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 12:02:31 crc kubenswrapper[4909]: I1201 12:02:31.278069 4909 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 12:02:31 crc kubenswrapper[4909]: I1201 12:02:31.278082 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 12:02:31 crc kubenswrapper[4909]: I1201 12:02:31.278100 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmjqk\" (UniqueName: \"kubernetes.io/projected/50803ad1-ed73-4df3-bddc-6d8fd12aa087-kube-api-access-kmjqk\") on node \"crc\" DevicePath \"\"" Dec 01 12:02:31 crc kubenswrapper[4909]: I1201 12:02:31.278120 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:02:31 crc kubenswrapper[4909]: I1201 12:02:31.278141 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: 
\"kubernetes.io/secret/50803ad1-ed73-4df3-bddc-6d8fd12aa087-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 01 12:02:31 crc kubenswrapper[4909]: I1201 12:02:31.332302 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" event={"ID":"50803ad1-ed73-4df3-bddc-6d8fd12aa087","Type":"ContainerDied","Data":"bfbdf42d200db9bf41d5be64ac776aeb3c56ef1bf1b4b3b0bdc514e678118d0f"} Dec 01 12:02:31 crc kubenswrapper[4909]: I1201 12:02:31.332355 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf" Dec 01 12:02:31 crc kubenswrapper[4909]: I1201 12:02:31.332370 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfbdf42d200db9bf41d5be64ac776aeb3c56ef1bf1b4b3b0bdc514e678118d0f" Dec 01 12:02:32 crc kubenswrapper[4909]: I1201 12:02:32.046720 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hpn68" Dec 01 12:02:32 crc kubenswrapper[4909]: I1201 12:02:32.048602 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hpn68" Dec 01 12:02:32 crc kubenswrapper[4909]: I1201 12:02:32.101273 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hpn68" Dec 01 12:02:32 crc kubenswrapper[4909]: I1201 12:02:32.395734 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hpn68" Dec 01 12:02:32 crc kubenswrapper[4909]: I1201 12:02:32.462214 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hpn68"] Dec 01 12:02:34 crc kubenswrapper[4909]: I1201 12:02:34.363930 4909 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-hpn68" podUID="769f194a-a720-4207-ac28-e63813534aa4" containerName="registry-server" containerID="cri-o://b212b75c58e50222f5239b3708b74dc598a2f6efbfe489945b327c367f302dbf" gracePeriod=2 Dec 01 12:02:34 crc kubenswrapper[4909]: I1201 12:02:34.812385 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hpn68" Dec 01 12:02:34 crc kubenswrapper[4909]: I1201 12:02:34.972013 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ggts\" (UniqueName: \"kubernetes.io/projected/769f194a-a720-4207-ac28-e63813534aa4-kube-api-access-5ggts\") pod \"769f194a-a720-4207-ac28-e63813534aa4\" (UID: \"769f194a-a720-4207-ac28-e63813534aa4\") " Dec 01 12:02:34 crc kubenswrapper[4909]: I1201 12:02:34.972065 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/769f194a-a720-4207-ac28-e63813534aa4-utilities\") pod \"769f194a-a720-4207-ac28-e63813534aa4\" (UID: \"769f194a-a720-4207-ac28-e63813534aa4\") " Dec 01 12:02:34 crc kubenswrapper[4909]: I1201 12:02:34.972280 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/769f194a-a720-4207-ac28-e63813534aa4-catalog-content\") pod \"769f194a-a720-4207-ac28-e63813534aa4\" (UID: \"769f194a-a720-4207-ac28-e63813534aa4\") " Dec 01 12:02:34 crc kubenswrapper[4909]: I1201 12:02:34.973325 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/769f194a-a720-4207-ac28-e63813534aa4-utilities" (OuterVolumeSpecName: "utilities") pod "769f194a-a720-4207-ac28-e63813534aa4" (UID: "769f194a-a720-4207-ac28-e63813534aa4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:02:34 crc kubenswrapper[4909]: I1201 12:02:34.993936 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/769f194a-a720-4207-ac28-e63813534aa4-kube-api-access-5ggts" (OuterVolumeSpecName: "kube-api-access-5ggts") pod "769f194a-a720-4207-ac28-e63813534aa4" (UID: "769f194a-a720-4207-ac28-e63813534aa4"). InnerVolumeSpecName "kube-api-access-5ggts". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:02:35 crc kubenswrapper[4909]: I1201 12:02:35.036101 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/769f194a-a720-4207-ac28-e63813534aa4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "769f194a-a720-4207-ac28-e63813534aa4" (UID: "769f194a-a720-4207-ac28-e63813534aa4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:02:35 crc kubenswrapper[4909]: I1201 12:02:35.074242 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ggts\" (UniqueName: \"kubernetes.io/projected/769f194a-a720-4207-ac28-e63813534aa4-kube-api-access-5ggts\") on node \"crc\" DevicePath \"\"" Dec 01 12:02:35 crc kubenswrapper[4909]: I1201 12:02:35.074302 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/769f194a-a720-4207-ac28-e63813534aa4-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:02:35 crc kubenswrapper[4909]: I1201 12:02:35.074318 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/769f194a-a720-4207-ac28-e63813534aa4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:02:35 crc kubenswrapper[4909]: I1201 12:02:35.382058 4909 generic.go:334] "Generic (PLEG): container finished" podID="769f194a-a720-4207-ac28-e63813534aa4" 
containerID="b212b75c58e50222f5239b3708b74dc598a2f6efbfe489945b327c367f302dbf" exitCode=0 Dec 01 12:02:35 crc kubenswrapper[4909]: I1201 12:02:35.382133 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpn68" event={"ID":"769f194a-a720-4207-ac28-e63813534aa4","Type":"ContainerDied","Data":"b212b75c58e50222f5239b3708b74dc598a2f6efbfe489945b327c367f302dbf"} Dec 01 12:02:35 crc kubenswrapper[4909]: I1201 12:02:35.382177 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpn68" event={"ID":"769f194a-a720-4207-ac28-e63813534aa4","Type":"ContainerDied","Data":"93ebd7590af9dc6e68e11e98e56e940c49ad73c85608a4096c8816d2e582dbaa"} Dec 01 12:02:35 crc kubenswrapper[4909]: I1201 12:02:35.382203 4909 scope.go:117] "RemoveContainer" containerID="b212b75c58e50222f5239b3708b74dc598a2f6efbfe489945b327c367f302dbf" Dec 01 12:02:35 crc kubenswrapper[4909]: I1201 12:02:35.382219 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hpn68" Dec 01 12:02:35 crc kubenswrapper[4909]: I1201 12:02:35.425405 4909 scope.go:117] "RemoveContainer" containerID="506629c97c1ed42930d804629e3b45fd173e290528d1041eec0edc034fc93002" Dec 01 12:02:35 crc kubenswrapper[4909]: I1201 12:02:35.426075 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hpn68"] Dec 01 12:02:35 crc kubenswrapper[4909]: I1201 12:02:35.447613 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hpn68"] Dec 01 12:02:35 crc kubenswrapper[4909]: I1201 12:02:35.467004 4909 scope.go:117] "RemoveContainer" containerID="6bb6d47d0b495f7949033386595ed928357dc946836880806911134e0ea904bc" Dec 01 12:02:35 crc kubenswrapper[4909]: I1201 12:02:35.519960 4909 scope.go:117] "RemoveContainer" containerID="b212b75c58e50222f5239b3708b74dc598a2f6efbfe489945b327c367f302dbf" Dec 01 12:02:35 crc kubenswrapper[4909]: E1201 12:02:35.520749 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b212b75c58e50222f5239b3708b74dc598a2f6efbfe489945b327c367f302dbf\": container with ID starting with b212b75c58e50222f5239b3708b74dc598a2f6efbfe489945b327c367f302dbf not found: ID does not exist" containerID="b212b75c58e50222f5239b3708b74dc598a2f6efbfe489945b327c367f302dbf" Dec 01 12:02:35 crc kubenswrapper[4909]: I1201 12:02:35.520786 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b212b75c58e50222f5239b3708b74dc598a2f6efbfe489945b327c367f302dbf"} err="failed to get container status \"b212b75c58e50222f5239b3708b74dc598a2f6efbfe489945b327c367f302dbf\": rpc error: code = NotFound desc = could not find container \"b212b75c58e50222f5239b3708b74dc598a2f6efbfe489945b327c367f302dbf\": container with ID starting with b212b75c58e50222f5239b3708b74dc598a2f6efbfe489945b327c367f302dbf not 
found: ID does not exist" Dec 01 12:02:35 crc kubenswrapper[4909]: I1201 12:02:35.520822 4909 scope.go:117] "RemoveContainer" containerID="506629c97c1ed42930d804629e3b45fd173e290528d1041eec0edc034fc93002" Dec 01 12:02:35 crc kubenswrapper[4909]: E1201 12:02:35.521212 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"506629c97c1ed42930d804629e3b45fd173e290528d1041eec0edc034fc93002\": container with ID starting with 506629c97c1ed42930d804629e3b45fd173e290528d1041eec0edc034fc93002 not found: ID does not exist" containerID="506629c97c1ed42930d804629e3b45fd173e290528d1041eec0edc034fc93002" Dec 01 12:02:35 crc kubenswrapper[4909]: I1201 12:02:35.521250 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"506629c97c1ed42930d804629e3b45fd173e290528d1041eec0edc034fc93002"} err="failed to get container status \"506629c97c1ed42930d804629e3b45fd173e290528d1041eec0edc034fc93002\": rpc error: code = NotFound desc = could not find container \"506629c97c1ed42930d804629e3b45fd173e290528d1041eec0edc034fc93002\": container with ID starting with 506629c97c1ed42930d804629e3b45fd173e290528d1041eec0edc034fc93002 not found: ID does not exist" Dec 01 12:02:35 crc kubenswrapper[4909]: I1201 12:02:35.521271 4909 scope.go:117] "RemoveContainer" containerID="6bb6d47d0b495f7949033386595ed928357dc946836880806911134e0ea904bc" Dec 01 12:02:35 crc kubenswrapper[4909]: E1201 12:02:35.521615 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bb6d47d0b495f7949033386595ed928357dc946836880806911134e0ea904bc\": container with ID starting with 6bb6d47d0b495f7949033386595ed928357dc946836880806911134e0ea904bc not found: ID does not exist" containerID="6bb6d47d0b495f7949033386595ed928357dc946836880806911134e0ea904bc" Dec 01 12:02:35 crc kubenswrapper[4909]: I1201 12:02:35.521676 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bb6d47d0b495f7949033386595ed928357dc946836880806911134e0ea904bc"} err="failed to get container status \"6bb6d47d0b495f7949033386595ed928357dc946836880806911134e0ea904bc\": rpc error: code = NotFound desc = could not find container \"6bb6d47d0b495f7949033386595ed928357dc946836880806911134e0ea904bc\": container with ID starting with 6bb6d47d0b495f7949033386595ed928357dc946836880806911134e0ea904bc not found: ID does not exist" Dec 01 12:02:36 crc kubenswrapper[4909]: I1201 12:02:36.194247 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:02:36 crc kubenswrapper[4909]: I1201 12:02:36.194337 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:02:36 crc kubenswrapper[4909]: I1201 12:02:36.194404 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 12:02:36 crc kubenswrapper[4909]: I1201 12:02:36.195550 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2aaec19c8a09c357b1c58d724097fc2fb7674e854b41091b1ce6a1532c298063"} pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 12:02:36 crc kubenswrapper[4909]: I1201 12:02:36.195634 4909 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" containerID="cri-o://2aaec19c8a09c357b1c58d724097fc2fb7674e854b41091b1ce6a1532c298063" gracePeriod=600 Dec 01 12:02:36 crc kubenswrapper[4909]: I1201 12:02:36.396915 4909 generic.go:334] "Generic (PLEG): container finished" podID="672850e4-d044-44cc-b8a2-517dc1a285be" containerID="2aaec19c8a09c357b1c58d724097fc2fb7674e854b41091b1ce6a1532c298063" exitCode=0 Dec 01 12:02:36 crc kubenswrapper[4909]: I1201 12:02:36.397409 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerDied","Data":"2aaec19c8a09c357b1c58d724097fc2fb7674e854b41091b1ce6a1532c298063"} Dec 01 12:02:36 crc kubenswrapper[4909]: I1201 12:02:36.397459 4909 scope.go:117] "RemoveContainer" containerID="3f5d3e5e0c40efd3ce4d47f7b5dd103441581481fd9bfbdbb030607921440807" Dec 01 12:02:37 crc kubenswrapper[4909]: I1201 12:02:37.271757 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="769f194a-a720-4207-ac28-e63813534aa4" path="/var/lib/kubelet/pods/769f194a-a720-4207-ac28-e63813534aa4/volumes" Dec 01 12:02:37 crc kubenswrapper[4909]: I1201 12:02:37.410736 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3"} Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.007877 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r2v9j/must-gather-zmzgc"] Dec 01 12:03:54 crc kubenswrapper[4909]: E1201 12:03:54.010567 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2678a022-9d18-4843-9a3a-4ef00f3c4451" containerName="extract-content" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.010805 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2678a022-9d18-4843-9a3a-4ef00f3c4451" containerName="extract-content" Dec 01 12:03:54 crc kubenswrapper[4909]: E1201 12:03:54.010900 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50803ad1-ed73-4df3-bddc-6d8fd12aa087" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.010962 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="50803ad1-ed73-4df3-bddc-6d8fd12aa087" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 12:03:54 crc kubenswrapper[4909]: E1201 12:03:54.011029 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="769f194a-a720-4207-ac28-e63813534aa4" containerName="extract-utilities" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.011082 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="769f194a-a720-4207-ac28-e63813534aa4" containerName="extract-utilities" Dec 01 12:03:54 crc kubenswrapper[4909]: E1201 12:03:54.011135 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2678a022-9d18-4843-9a3a-4ef00f3c4451" containerName="registry-server" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.011194 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2678a022-9d18-4843-9a3a-4ef00f3c4451" containerName="registry-server" Dec 01 12:03:54 crc kubenswrapper[4909]: E1201 12:03:54.011263 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2678a022-9d18-4843-9a3a-4ef00f3c4451" containerName="extract-utilities" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.011332 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2678a022-9d18-4843-9a3a-4ef00f3c4451" containerName="extract-utilities" Dec 01 12:03:54 crc kubenswrapper[4909]: E1201 12:03:54.011396 4909 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="769f194a-a720-4207-ac28-e63813534aa4" containerName="registry-server" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.011460 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="769f194a-a720-4207-ac28-e63813534aa4" containerName="registry-server" Dec 01 12:03:54 crc kubenswrapper[4909]: E1201 12:03:54.011523 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="769f194a-a720-4207-ac28-e63813534aa4" containerName="extract-content" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.011573 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="769f194a-a720-4207-ac28-e63813534aa4" containerName="extract-content" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.011848 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2678a022-9d18-4843-9a3a-4ef00f3c4451" containerName="registry-server" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.011933 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="769f194a-a720-4207-ac28-e63813534aa4" containerName="registry-server" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.011993 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="50803ad1-ed73-4df3-bddc-6d8fd12aa087" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.013402 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r2v9j/must-gather-zmzgc" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.026573 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-r2v9j"/"openshift-service-ca.crt" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.026998 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-r2v9j"/"default-dockercfg-kc88k" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.027247 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-r2v9j"/"kube-root-ca.crt" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.027558 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r2v9j/must-gather-zmzgc"] Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.172699 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/219f1941-d19d-4d35-9783-49159faf5cf4-must-gather-output\") pod \"must-gather-zmzgc\" (UID: \"219f1941-d19d-4d35-9783-49159faf5cf4\") " pod="openshift-must-gather-r2v9j/must-gather-zmzgc" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.172828 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74shm\" (UniqueName: \"kubernetes.io/projected/219f1941-d19d-4d35-9783-49159faf5cf4-kube-api-access-74shm\") pod \"must-gather-zmzgc\" (UID: \"219f1941-d19d-4d35-9783-49159faf5cf4\") " pod="openshift-must-gather-r2v9j/must-gather-zmzgc" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.275704 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/219f1941-d19d-4d35-9783-49159faf5cf4-must-gather-output\") pod \"must-gather-zmzgc\" (UID: \"219f1941-d19d-4d35-9783-49159faf5cf4\") " 
pod="openshift-must-gather-r2v9j/must-gather-zmzgc" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.275945 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74shm\" (UniqueName: \"kubernetes.io/projected/219f1941-d19d-4d35-9783-49159faf5cf4-kube-api-access-74shm\") pod \"must-gather-zmzgc\" (UID: \"219f1941-d19d-4d35-9783-49159faf5cf4\") " pod="openshift-must-gather-r2v9j/must-gather-zmzgc" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.276207 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/219f1941-d19d-4d35-9783-49159faf5cf4-must-gather-output\") pod \"must-gather-zmzgc\" (UID: \"219f1941-d19d-4d35-9783-49159faf5cf4\") " pod="openshift-must-gather-r2v9j/must-gather-zmzgc" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.303547 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74shm\" (UniqueName: \"kubernetes.io/projected/219f1941-d19d-4d35-9783-49159faf5cf4-kube-api-access-74shm\") pod \"must-gather-zmzgc\" (UID: \"219f1941-d19d-4d35-9783-49159faf5cf4\") " pod="openshift-must-gather-r2v9j/must-gather-zmzgc" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.341387 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r2v9j/must-gather-zmzgc" Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.807560 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r2v9j/must-gather-zmzgc"] Dec 01 12:03:54 crc kubenswrapper[4909]: I1201 12:03:54.822770 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 12:03:55 crc kubenswrapper[4909]: I1201 12:03:55.111044 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r2v9j/must-gather-zmzgc" event={"ID":"219f1941-d19d-4d35-9783-49159faf5cf4","Type":"ContainerStarted","Data":"a23b85fe6e7c4ce776cfa4ef70d5a617058d701b762f83287eaec75fcd58183c"} Dec 01 12:04:00 crc kubenswrapper[4909]: I1201 12:04:00.157714 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r2v9j/must-gather-zmzgc" event={"ID":"219f1941-d19d-4d35-9783-49159faf5cf4","Type":"ContainerStarted","Data":"787b422e6497301875faad24d946fcf152f575377ff8047e9b3eeab00e2550f4"} Dec 01 12:04:00 crc kubenswrapper[4909]: I1201 12:04:00.158316 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r2v9j/must-gather-zmzgc" event={"ID":"219f1941-d19d-4d35-9783-49159faf5cf4","Type":"ContainerStarted","Data":"344bb44bf6d8a018c1939f097e23d5477ad6b1c110a03e01b0f2e598c6609873"} Dec 01 12:04:00 crc kubenswrapper[4909]: I1201 12:04:00.175893 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r2v9j/must-gather-zmzgc" podStartSLOduration=3.040959485 podStartE2EDuration="7.17585576s" podCreationTimestamp="2025-12-01 12:03:53 +0000 UTC" firstStartedPulling="2025-12-01 12:03:54.822472981 +0000 UTC m=+5552.056943889" lastFinishedPulling="2025-12-01 12:03:58.957369266 +0000 UTC m=+5556.191840164" observedRunningTime="2025-12-01 12:04:00.171897325 +0000 UTC m=+5557.406368223" watchObservedRunningTime="2025-12-01 12:04:00.17585576 +0000 UTC 
m=+5557.410326658" Dec 01 12:04:03 crc kubenswrapper[4909]: I1201 12:04:03.105803 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r2v9j/crc-debug-bnp28"] Dec 01 12:04:03 crc kubenswrapper[4909]: I1201 12:04:03.107638 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r2v9j/crc-debug-bnp28" Dec 01 12:04:03 crc kubenswrapper[4909]: I1201 12:04:03.246447 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxfkg\" (UniqueName: \"kubernetes.io/projected/5734552c-747f-488d-bc60-66913157542e-kube-api-access-lxfkg\") pod \"crc-debug-bnp28\" (UID: \"5734552c-747f-488d-bc60-66913157542e\") " pod="openshift-must-gather-r2v9j/crc-debug-bnp28" Dec 01 12:04:03 crc kubenswrapper[4909]: I1201 12:04:03.246890 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5734552c-747f-488d-bc60-66913157542e-host\") pod \"crc-debug-bnp28\" (UID: \"5734552c-747f-488d-bc60-66913157542e\") " pod="openshift-must-gather-r2v9j/crc-debug-bnp28" Dec 01 12:04:03 crc kubenswrapper[4909]: I1201 12:04:03.348509 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxfkg\" (UniqueName: \"kubernetes.io/projected/5734552c-747f-488d-bc60-66913157542e-kube-api-access-lxfkg\") pod \"crc-debug-bnp28\" (UID: \"5734552c-747f-488d-bc60-66913157542e\") " pod="openshift-must-gather-r2v9j/crc-debug-bnp28" Dec 01 12:04:03 crc kubenswrapper[4909]: I1201 12:04:03.349190 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5734552c-747f-488d-bc60-66913157542e-host\") pod \"crc-debug-bnp28\" (UID: \"5734552c-747f-488d-bc60-66913157542e\") " pod="openshift-must-gather-r2v9j/crc-debug-bnp28" Dec 01 12:04:03 crc kubenswrapper[4909]: I1201 12:04:03.349337 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5734552c-747f-488d-bc60-66913157542e-host\") pod \"crc-debug-bnp28\" (UID: \"5734552c-747f-488d-bc60-66913157542e\") " pod="openshift-must-gather-r2v9j/crc-debug-bnp28" Dec 01 12:04:03 crc kubenswrapper[4909]: I1201 12:04:03.381635 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxfkg\" (UniqueName: \"kubernetes.io/projected/5734552c-747f-488d-bc60-66913157542e-kube-api-access-lxfkg\") pod \"crc-debug-bnp28\" (UID: \"5734552c-747f-488d-bc60-66913157542e\") " pod="openshift-must-gather-r2v9j/crc-debug-bnp28" Dec 01 12:04:03 crc kubenswrapper[4909]: I1201 12:04:03.426134 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r2v9j/crc-debug-bnp28" Dec 01 12:04:04 crc kubenswrapper[4909]: I1201 12:04:04.193161 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r2v9j/crc-debug-bnp28" event={"ID":"5734552c-747f-488d-bc60-66913157542e","Type":"ContainerStarted","Data":"8f07c733639d490de6df297a1694980fd3baa476ed5c15dc072a9334f2e797e8"} Dec 01 12:04:14 crc kubenswrapper[4909]: I1201 12:04:14.291405 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r2v9j/crc-debug-bnp28" event={"ID":"5734552c-747f-488d-bc60-66913157542e","Type":"ContainerStarted","Data":"4b451188051d38eada5cd8869ec1187a1ed6aca217e1934015d2ecb6458a5e0f"} Dec 01 12:04:14 crc kubenswrapper[4909]: I1201 12:04:14.311285 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r2v9j/crc-debug-bnp28" podStartSLOduration=1.2688874239999999 podStartE2EDuration="11.31126638s" podCreationTimestamp="2025-12-01 12:04:03 +0000 UTC" firstStartedPulling="2025-12-01 12:04:03.462994623 +0000 UTC m=+5560.697465521" lastFinishedPulling="2025-12-01 12:04:13.505373579 +0000 UTC m=+5570.739844477" 
observedRunningTime="2025-12-01 12:04:14.3042538 +0000 UTC m=+5571.538724698" watchObservedRunningTime="2025-12-01 12:04:14.31126638 +0000 UTC m=+5571.545737278" Dec 01 12:04:31 crc kubenswrapper[4909]: I1201 12:04:31.456396 4909 generic.go:334] "Generic (PLEG): container finished" podID="5734552c-747f-488d-bc60-66913157542e" containerID="4b451188051d38eada5cd8869ec1187a1ed6aca217e1934015d2ecb6458a5e0f" exitCode=0 Dec 01 12:04:31 crc kubenswrapper[4909]: I1201 12:04:31.456495 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r2v9j/crc-debug-bnp28" event={"ID":"5734552c-747f-488d-bc60-66913157542e","Type":"ContainerDied","Data":"4b451188051d38eada5cd8869ec1187a1ed6aca217e1934015d2ecb6458a5e0f"} Dec 01 12:04:32 crc kubenswrapper[4909]: I1201 12:04:32.573563 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r2v9j/crc-debug-bnp28" Dec 01 12:04:32 crc kubenswrapper[4909]: I1201 12:04:32.609675 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r2v9j/crc-debug-bnp28"] Dec 01 12:04:32 crc kubenswrapper[4909]: I1201 12:04:32.616360 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5734552c-747f-488d-bc60-66913157542e-host\") pod \"5734552c-747f-488d-bc60-66913157542e\" (UID: \"5734552c-747f-488d-bc60-66913157542e\") " Dec 01 12:04:32 crc kubenswrapper[4909]: I1201 12:04:32.616535 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxfkg\" (UniqueName: \"kubernetes.io/projected/5734552c-747f-488d-bc60-66913157542e-kube-api-access-lxfkg\") pod \"5734552c-747f-488d-bc60-66913157542e\" (UID: \"5734552c-747f-488d-bc60-66913157542e\") " Dec 01 12:04:32 crc kubenswrapper[4909]: I1201 12:04:32.616517 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/5734552c-747f-488d-bc60-66913157542e-host" (OuterVolumeSpecName: "host") pod "5734552c-747f-488d-bc60-66913157542e" (UID: "5734552c-747f-488d-bc60-66913157542e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 12:04:32 crc kubenswrapper[4909]: I1201 12:04:32.617166 4909 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5734552c-747f-488d-bc60-66913157542e-host\") on node \"crc\" DevicePath \"\"" Dec 01 12:04:32 crc kubenswrapper[4909]: I1201 12:04:32.622585 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r2v9j/crc-debug-bnp28"] Dec 01 12:04:32 crc kubenswrapper[4909]: I1201 12:04:32.625326 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5734552c-747f-488d-bc60-66913157542e-kube-api-access-lxfkg" (OuterVolumeSpecName: "kube-api-access-lxfkg") pod "5734552c-747f-488d-bc60-66913157542e" (UID: "5734552c-747f-488d-bc60-66913157542e"). InnerVolumeSpecName "kube-api-access-lxfkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:04:32 crc kubenswrapper[4909]: I1201 12:04:32.719177 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxfkg\" (UniqueName: \"kubernetes.io/projected/5734552c-747f-488d-bc60-66913157542e-kube-api-access-lxfkg\") on node \"crc\" DevicePath \"\"" Dec 01 12:04:33 crc kubenswrapper[4909]: I1201 12:04:33.271964 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5734552c-747f-488d-bc60-66913157542e" path="/var/lib/kubelet/pods/5734552c-747f-488d-bc60-66913157542e/volumes" Dec 01 12:04:33 crc kubenswrapper[4909]: I1201 12:04:33.475021 4909 scope.go:117] "RemoveContainer" containerID="4b451188051d38eada5cd8869ec1187a1ed6aca217e1934015d2ecb6458a5e0f" Dec 01 12:04:33 crc kubenswrapper[4909]: I1201 12:04:33.475183 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r2v9j/crc-debug-bnp28" Dec 01 12:04:33 crc kubenswrapper[4909]: I1201 12:04:33.774763 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r2v9j/crc-debug-2tfsd"] Dec 01 12:04:33 crc kubenswrapper[4909]: E1201 12:04:33.775243 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5734552c-747f-488d-bc60-66913157542e" containerName="container-00" Dec 01 12:04:33 crc kubenswrapper[4909]: I1201 12:04:33.775260 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5734552c-747f-488d-bc60-66913157542e" containerName="container-00" Dec 01 12:04:33 crc kubenswrapper[4909]: I1201 12:04:33.775509 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5734552c-747f-488d-bc60-66913157542e" containerName="container-00" Dec 01 12:04:33 crc kubenswrapper[4909]: I1201 12:04:33.776272 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r2v9j/crc-debug-2tfsd" Dec 01 12:04:33 crc kubenswrapper[4909]: I1201 12:04:33.837891 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7nw4\" (UniqueName: \"kubernetes.io/projected/5f321034-fbdd-4d53-b335-46d7f8976f2f-kube-api-access-m7nw4\") pod \"crc-debug-2tfsd\" (UID: \"5f321034-fbdd-4d53-b335-46d7f8976f2f\") " pod="openshift-must-gather-r2v9j/crc-debug-2tfsd" Dec 01 12:04:33 crc kubenswrapper[4909]: I1201 12:04:33.838203 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f321034-fbdd-4d53-b335-46d7f8976f2f-host\") pod \"crc-debug-2tfsd\" (UID: \"5f321034-fbdd-4d53-b335-46d7f8976f2f\") " pod="openshift-must-gather-r2v9j/crc-debug-2tfsd" Dec 01 12:04:33 crc kubenswrapper[4909]: I1201 12:04:33.939381 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7nw4\" (UniqueName: 
\"kubernetes.io/projected/5f321034-fbdd-4d53-b335-46d7f8976f2f-kube-api-access-m7nw4\") pod \"crc-debug-2tfsd\" (UID: \"5f321034-fbdd-4d53-b335-46d7f8976f2f\") " pod="openshift-must-gather-r2v9j/crc-debug-2tfsd" Dec 01 12:04:33 crc kubenswrapper[4909]: I1201 12:04:33.939635 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f321034-fbdd-4d53-b335-46d7f8976f2f-host\") pod \"crc-debug-2tfsd\" (UID: \"5f321034-fbdd-4d53-b335-46d7f8976f2f\") " pod="openshift-must-gather-r2v9j/crc-debug-2tfsd" Dec 01 12:04:33 crc kubenswrapper[4909]: I1201 12:04:33.939753 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f321034-fbdd-4d53-b335-46d7f8976f2f-host\") pod \"crc-debug-2tfsd\" (UID: \"5f321034-fbdd-4d53-b335-46d7f8976f2f\") " pod="openshift-must-gather-r2v9j/crc-debug-2tfsd" Dec 01 12:04:33 crc kubenswrapper[4909]: I1201 12:04:33.959628 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7nw4\" (UniqueName: \"kubernetes.io/projected/5f321034-fbdd-4d53-b335-46d7f8976f2f-kube-api-access-m7nw4\") pod \"crc-debug-2tfsd\" (UID: \"5f321034-fbdd-4d53-b335-46d7f8976f2f\") " pod="openshift-must-gather-r2v9j/crc-debug-2tfsd" Dec 01 12:04:34 crc kubenswrapper[4909]: I1201 12:04:34.092045 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r2v9j/crc-debug-2tfsd" Dec 01 12:04:34 crc kubenswrapper[4909]: I1201 12:04:34.486221 4909 generic.go:334] "Generic (PLEG): container finished" podID="5f321034-fbdd-4d53-b335-46d7f8976f2f" containerID="a9df24f0ef969ce280505df69a3a9ebdaa049bceb83570e0149af3407330df37" exitCode=1 Dec 01 12:04:34 crc kubenswrapper[4909]: I1201 12:04:34.486307 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r2v9j/crc-debug-2tfsd" event={"ID":"5f321034-fbdd-4d53-b335-46d7f8976f2f","Type":"ContainerDied","Data":"a9df24f0ef969ce280505df69a3a9ebdaa049bceb83570e0149af3407330df37"} Dec 01 12:04:34 crc kubenswrapper[4909]: I1201 12:04:34.486375 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r2v9j/crc-debug-2tfsd" event={"ID":"5f321034-fbdd-4d53-b335-46d7f8976f2f","Type":"ContainerStarted","Data":"969df4a9d3b9699e7e020a87ab1e888be3e9d55638db03118ed2155f186c990a"} Dec 01 12:04:34 crc kubenswrapper[4909]: I1201 12:04:34.529792 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r2v9j/crc-debug-2tfsd"] Dec 01 12:04:34 crc kubenswrapper[4909]: I1201 12:04:34.539982 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r2v9j/crc-debug-2tfsd"] Dec 01 12:04:35 crc kubenswrapper[4909]: I1201 12:04:35.579846 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r2v9j/crc-debug-2tfsd" Dec 01 12:04:35 crc kubenswrapper[4909]: I1201 12:04:35.675486 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7nw4\" (UniqueName: \"kubernetes.io/projected/5f321034-fbdd-4d53-b335-46d7f8976f2f-kube-api-access-m7nw4\") pod \"5f321034-fbdd-4d53-b335-46d7f8976f2f\" (UID: \"5f321034-fbdd-4d53-b335-46d7f8976f2f\") " Dec 01 12:04:35 crc kubenswrapper[4909]: I1201 12:04:35.675566 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f321034-fbdd-4d53-b335-46d7f8976f2f-host\") pod \"5f321034-fbdd-4d53-b335-46d7f8976f2f\" (UID: \"5f321034-fbdd-4d53-b335-46d7f8976f2f\") " Dec 01 12:04:35 crc kubenswrapper[4909]: I1201 12:04:35.675620 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f321034-fbdd-4d53-b335-46d7f8976f2f-host" (OuterVolumeSpecName: "host") pod "5f321034-fbdd-4d53-b335-46d7f8976f2f" (UID: "5f321034-fbdd-4d53-b335-46d7f8976f2f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 12:04:35 crc kubenswrapper[4909]: I1201 12:04:35.676132 4909 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f321034-fbdd-4d53-b335-46d7f8976f2f-host\") on node \"crc\" DevicePath \"\"" Dec 01 12:04:35 crc kubenswrapper[4909]: I1201 12:04:35.680806 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f321034-fbdd-4d53-b335-46d7f8976f2f-kube-api-access-m7nw4" (OuterVolumeSpecName: "kube-api-access-m7nw4") pod "5f321034-fbdd-4d53-b335-46d7f8976f2f" (UID: "5f321034-fbdd-4d53-b335-46d7f8976f2f"). InnerVolumeSpecName "kube-api-access-m7nw4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:04:35 crc kubenswrapper[4909]: I1201 12:04:35.777721 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7nw4\" (UniqueName: \"kubernetes.io/projected/5f321034-fbdd-4d53-b335-46d7f8976f2f-kube-api-access-m7nw4\") on node \"crc\" DevicePath \"\"" Dec 01 12:04:36 crc kubenswrapper[4909]: I1201 12:04:36.193419 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:04:36 crc kubenswrapper[4909]: I1201 12:04:36.193485 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:04:36 crc kubenswrapper[4909]: I1201 12:04:36.504976 4909 scope.go:117] "RemoveContainer" containerID="a9df24f0ef969ce280505df69a3a9ebdaa049bceb83570e0149af3407330df37" Dec 01 12:04:36 crc kubenswrapper[4909]: I1201 12:04:36.505016 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r2v9j/crc-debug-2tfsd" Dec 01 12:04:37 crc kubenswrapper[4909]: I1201 12:04:37.268641 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f321034-fbdd-4d53-b335-46d7f8976f2f" path="/var/lib/kubelet/pods/5f321034-fbdd-4d53-b335-46d7f8976f2f/volumes" Dec 01 12:05:06 crc kubenswrapper[4909]: I1201 12:05:06.193204 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:05:06 crc kubenswrapper[4909]: I1201 12:05:06.193672 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:05:08 crc kubenswrapper[4909]: I1201 12:05:08.316668 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-744d7cd6db-d2s8r_3d56ac4f-6def-4711-bc77-98ed18152626/barbican-api/0.log" Dec 01 12:05:08 crc kubenswrapper[4909]: I1201 12:05:08.548108 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-744d7cd6db-d2s8r_3d56ac4f-6def-4711-bc77-98ed18152626/barbican-api-log/0.log" Dec 01 12:05:08 crc kubenswrapper[4909]: I1201 12:05:08.564690 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-66c459b97d-lt964_b0fb7293-36bc-4a84-b9a7-9eec7a62367b/barbican-keystone-listener/0.log" Dec 01 12:05:08 crc kubenswrapper[4909]: I1201 12:05:08.658603 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-66c459b97d-lt964_b0fb7293-36bc-4a84-b9a7-9eec7a62367b/barbican-keystone-listener-log/0.log" Dec 01 12:05:08 crc kubenswrapper[4909]: I1201 12:05:08.812539 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-779888c757-glwbd_a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e/barbican-worker-log/0.log" Dec 01 12:05:08 crc kubenswrapper[4909]: I1201 12:05:08.833624 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-779888c757-glwbd_a68f4e1f-b6e9-4e1b-96b7-5522a8330c9e/barbican-worker/0.log" Dec 01 12:05:09 crc kubenswrapper[4909]: I1201 12:05:09.041031 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-4jr7f_4e2a41d6-d2aa-4db3-828a-e66538f66af0/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 12:05:09 crc kubenswrapper[4909]: I1201 12:05:09.139239 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_98e4c943-29ab-4bb6-ab5d-a63b167e6e2f/ceilometer-central-agent/0.log" Dec 01 12:05:09 crc kubenswrapper[4909]: I1201 12:05:09.210574 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_98e4c943-29ab-4bb6-ab5d-a63b167e6e2f/ceilometer-notification-agent/0.log" Dec 01 12:05:09 crc kubenswrapper[4909]: I1201 12:05:09.273957 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_98e4c943-29ab-4bb6-ab5d-a63b167e6e2f/proxy-httpd/0.log" Dec 01 12:05:09 crc kubenswrapper[4909]: I1201 12:05:09.325140 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_98e4c943-29ab-4bb6-ab5d-a63b167e6e2f/sg-core/0.log" Dec 01 12:05:09 crc kubenswrapper[4909]: I1201 12:05:09.418029 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-jw5vd_0008cf3a-bf47-4886-a3a2-3b68c09d8ff1/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 12:05:09 crc kubenswrapper[4909]: I1201 12:05:09.523032 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sdwvf_992fb0af-fdd9-464d-92cc-454f8cc2cfb4/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 12:05:09 crc kubenswrapper[4909]: I1201 12:05:09.778466 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2c19b6f5-fcf4-4655-bdaa-10257b92f6dd/cinder-api-log/0.log" Dec 01 12:05:09 crc kubenswrapper[4909]: I1201 12:05:09.781973 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2c19b6f5-fcf4-4655-bdaa-10257b92f6dd/cinder-api/0.log" Dec 01 12:05:09 crc kubenswrapper[4909]: I1201 12:05:09.935800 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_da6d9185-171d-4197-9265-4252b98166e7/cinder-scheduler/0.log" Dec 01 12:05:10 crc kubenswrapper[4909]: I1201 12:05:10.065097 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_da6d9185-171d-4197-9265-4252b98166e7/probe/0.log" Dec 01 12:05:10 crc kubenswrapper[4909]: I1201 12:05:10.082352 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-562bw_a884bb23-e602-4578-8fa9-ac8b6fdf5eb8/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 12:05:10 crc kubenswrapper[4909]: I1201 12:05:10.278500 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-b24p9_df51a24f-6b29-49b3-bdee-153cb29154fe/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 12:05:10 crc kubenswrapper[4909]: I1201 12:05:10.352316 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-864d5fc68c-4f592_68b41abf-32f2-4e47-b5ac-f1e689edda28/init/0.log" Dec 01 12:05:10 crc kubenswrapper[4909]: I1201 12:05:10.542757 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-864d5fc68c-4f592_68b41abf-32f2-4e47-b5ac-f1e689edda28/init/0.log" Dec 01 12:05:10 crc kubenswrapper[4909]: I1201 12:05:10.665006 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-zrjvc_365bd5e9-5b82-4c6d-b8cf-57a73146653e/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 12:05:10 crc kubenswrapper[4909]: I1201 12:05:10.694249 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-864d5fc68c-4f592_68b41abf-32f2-4e47-b5ac-f1e689edda28/dnsmasq-dns/0.log" Dec 01 12:05:10 crc kubenswrapper[4909]: I1201 12:05:10.889323 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-72glh_bb6d5a74-703d-4f32-9066-61a28fbad67f/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 12:05:11 crc kubenswrapper[4909]: I1201 12:05:11.068801 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-58dc5cfbbd-v7pkq_c6bb8d66-286d-4137-ab50-a69fec49ab3c/keystone-api/0.log" Dec 01 12:05:11 crc kubenswrapper[4909]: I1201 12:05:11.149234 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29409781-m7z6j_264e1bcf-af02-4231-9f5e-84f4bae0db08/keystone-cron/0.log" Dec 01 12:05:11 crc kubenswrapper[4909]: I1201 12:05:11.317128 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29409841-6bnlj_d0c4f751-d7f4-48fc-815d-402b69708a55/keystone-cron/0.log" Dec 01 12:05:11 crc kubenswrapper[4909]: I1201 12:05:11.384353 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_7b55ffd9-122b-4a3b-b592-99f375e261fe/kube-state-metrics/0.log" Dec 01 12:05:11 crc kubenswrapper[4909]: I1201 12:05:11.518625 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-9fb6x_447d20e1-014c-49bb-a3c9-9057b255a1ed/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 12:05:11 crc kubenswrapper[4909]: I1201 12:05:11.596066 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-djfbt_62961846-7ca9-4d96-8f98-84570706b555/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 12:05:11 crc kubenswrapper[4909]: I1201 12:05:11.797402 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-fmxdf_50803ad1-ed73-4df3-bddc-6d8fd12aa087/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 12:05:11 crc kubenswrapper[4909]: I1201 12:05:11.933470 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rnk2b_97aa464f-8d31-450b-a16a-4c6538c27bbb/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 12:05:12 crc kubenswrapper[4909]: I1201 12:05:12.042527 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-v9khp_6e92c763-dfb0-42ce-ae14-d0196b547985/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 12:05:12 crc kubenswrapper[4909]: I1201 12:05:12.464200 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-zshgn_fde0fbe6-aa69-45a1-ac7e-a1db200b909a/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 12:05:12 crc kubenswrapper[4909]: I1201 12:05:12.464419 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-z9wzr_fb02b769-e281-4b1d-8bdc-b414fa58587f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 12:05:12 crc kubenswrapper[4909]: I1201 12:05:12.540649 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-68f697f85-549sp_65cec39f-6062-43a3-bf73-d7091a16e0a0/neutron-api/0.log" Dec 01 12:05:12 crc kubenswrapper[4909]: I1201 12:05:12.693707 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-68f697f85-549sp_65cec39f-6062-43a3-bf73-d7091a16e0a0/neutron-httpd/0.log" Dec 01 12:05:12 crc kubenswrapper[4909]: I1201 12:05:12.778669 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-kbfc5_4c73174a-75a2-47d9-82f9-5b26ea497032/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 12:05:13 crc kubenswrapper[4909]: I1201 12:05:13.129701 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5d7baabf-a92d-4e97-847f-aa1a692d206f/nova-api-log/0.log" Dec 01 12:05:13 crc kubenswrapper[4909]: I1201 12:05:13.187293 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_c254210a-6515-499c-b95c-1bf34961cf05/nova-cell0-conductor-conductor/0.log" Dec 01 12:05:13 crc kubenswrapper[4909]: I1201 12:05:13.491695 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_825ab14c-14ed-4ca8-a367-15ed10bf1bc9/nova-cell1-conductor-conductor/0.log" Dec 01 12:05:13 crc kubenswrapper[4909]: I1201 12:05:13.627992 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5d7baabf-a92d-4e97-847f-aa1a692d206f/nova-api-api/0.log" Dec 01 12:05:13 crc kubenswrapper[4909]: I1201 12:05:13.730575 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_e1abbfc5-9c24-418a-be94-4a74fd32e687/nova-cell1-novncproxy-novncproxy/0.log" Dec 01 12:05:13 crc kubenswrapper[4909]: I1201 12:05:13.826303 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_da6bc7be-6a1b-42f5-ae7c-1c7a5288755e/nova-metadata-log/0.log" Dec 01 12:05:14 crc kubenswrapper[4909]: I1201 12:05:14.132740 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_536ef4f2-2531-42e1-8c55-44e21e282a03/nova-scheduler-scheduler/0.log" Dec 01 12:05:14 crc kubenswrapper[4909]: I1201 12:05:14.190323 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9/mysql-bootstrap/0.log" Dec 01 12:05:14 crc kubenswrapper[4909]: I1201 12:05:14.409600 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9/galera/0.log" Dec 01 12:05:14 crc kubenswrapper[4909]: I1201 12:05:14.417757 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b9b91481-b2f9-4ad4-8720-6b9f7d27c2b9/mysql-bootstrap/0.log" Dec 01 12:05:14 crc kubenswrapper[4909]: I1201 12:05:14.654498 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5b522c19-139d-41b2-ad31-94e12157a398/mysql-bootstrap/0.log" Dec 01 12:05:15 crc kubenswrapper[4909]: I1201 12:05:15.023036 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5b522c19-139d-41b2-ad31-94e12157a398/mysql-bootstrap/0.log" Dec 01 12:05:15 crc kubenswrapper[4909]: I1201 12:05:15.031346 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5b522c19-139d-41b2-ad31-94e12157a398/galera/0.log" Dec 01 12:05:15 crc kubenswrapper[4909]: I1201 12:05:15.267142 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_09ab22ab-a48e-4e11-b498-9f78812079b1/openstackclient/0.log" Dec 01 12:05:15 crc kubenswrapper[4909]: I1201 12:05:15.401078 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-mq7gt_5dbfc51e-589a-43e5-805d-e5856f361b43/openstack-network-exporter/0.log" Dec 01 12:05:15 crc kubenswrapper[4909]: I1201 12:05:15.535966 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z64hl_cfbdb448-fb4d-48cd-8fa4-b9309172c7f4/ovsdb-server-init/0.log" Dec 01 12:05:15 crc kubenswrapper[4909]: I1201 12:05:15.749007 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z64hl_cfbdb448-fb4d-48cd-8fa4-b9309172c7f4/ovs-vswitchd/0.log" Dec 01 12:05:15 crc kubenswrapper[4909]: I1201 12:05:15.762748 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z64hl_cfbdb448-fb4d-48cd-8fa4-b9309172c7f4/ovsdb-server-init/0.log" Dec 01 12:05:15 crc kubenswrapper[4909]: I1201 12:05:15.836286 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z64hl_cfbdb448-fb4d-48cd-8fa4-b9309172c7f4/ovsdb-server/0.log" Dec 01 12:05:15 crc kubenswrapper[4909]: I1201 12:05:15.983587 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-zcgpf_a2ab5fcc-f33d-4495-a7e4-c4305f3e846e/ovn-controller/0.log" Dec 01 12:05:16 crc kubenswrapper[4909]: I1201 12:05:16.225542 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_da6bc7be-6a1b-42f5-ae7c-1c7a5288755e/nova-metadata-metadata/0.log" Dec 01 12:05:16 crc kubenswrapper[4909]: I1201 12:05:16.236382 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-qfv6g_8fd4dda9-f078-4d82-bbf0-040ef5d994cb/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 12:05:16 crc kubenswrapper[4909]: I1201 12:05:16.311049 
4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b1856063-0b40-4fee-ab30-024128a88da8/openstack-network-exporter/0.log" Dec 01 12:05:16 crc kubenswrapper[4909]: I1201 12:05:16.461280 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dcc01640-a3cc-49fe-b49a-ef344e34d793/openstack-network-exporter/0.log" Dec 01 12:05:16 crc kubenswrapper[4909]: I1201 12:05:16.477776 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b1856063-0b40-4fee-ab30-024128a88da8/ovn-northd/0.log" Dec 01 12:05:16 crc kubenswrapper[4909]: I1201 12:05:16.647237 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dcc01640-a3cc-49fe-b49a-ef344e34d793/ovsdbserver-nb/0.log" Dec 01 12:05:16 crc kubenswrapper[4909]: I1201 12:05:16.797342 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c44b2fd1-af1d-4e0a-8316-eedf732df3ce/openstack-network-exporter/0.log" Dec 01 12:05:16 crc kubenswrapper[4909]: I1201 12:05:16.923768 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c44b2fd1-af1d-4e0a-8316-eedf732df3ce/ovsdbserver-sb/0.log" Dec 01 12:05:17 crc kubenswrapper[4909]: I1201 12:05:17.088563 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7548d5fdbb-tbnwp_5e0dedc9-48c5-448f-9d8f-c2eb51401705/placement-api/0.log" Dec 01 12:05:17 crc kubenswrapper[4909]: I1201 12:05:17.136077 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7548d5fdbb-tbnwp_5e0dedc9-48c5-448f-9d8f-c2eb51401705/placement-log/0.log" Dec 01 12:05:17 crc kubenswrapper[4909]: I1201 12:05:17.159956 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_46315eac-b29e-48fa-864d-f105eefd2fc3/setup-container/0.log" Dec 01 12:05:17 crc kubenswrapper[4909]: I1201 12:05:17.396502 4909 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_46315eac-b29e-48fa-864d-f105eefd2fc3/setup-container/0.log" Dec 01 12:05:17 crc kubenswrapper[4909]: I1201 12:05:17.431063 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_46315eac-b29e-48fa-864d-f105eefd2fc3/rabbitmq/0.log" Dec 01 12:05:17 crc kubenswrapper[4909]: I1201 12:05:17.498795 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b/setup-container/0.log" Dec 01 12:05:17 crc kubenswrapper[4909]: I1201 12:05:17.671979 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b/setup-container/0.log" Dec 01 12:05:17 crc kubenswrapper[4909]: I1201 12:05:17.677492 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ea1f7af6-3ddf-4acb-b99b-3c8bb66d377b/rabbitmq/0.log" Dec 01 12:05:17 crc kubenswrapper[4909]: I1201 12:05:17.802741 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-b92gh_b1a2a204-dcb3-401d-8f9f-aa2d9bf5fbb5/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 12:05:17 crc kubenswrapper[4909]: I1201 12:05:17.923060 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-s954s_b6b42ea5-6854-49d3-82bb-d7559fc8e9d5/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 12:05:18 crc kubenswrapper[4909]: I1201 12:05:18.088365 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-5xbt4_948452ff-597c-4b3d-ba9a-11fb527a3c55/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 12:05:18 crc kubenswrapper[4909]: I1201 12:05:18.183312 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-52zqm_4ce039e6-b0e9-43d5-bf88-f4c169bb03ef/ssh-known-hosts-edpm-deployment/0.log" Dec 01 12:05:18 crc kubenswrapper[4909]: I1201 12:05:18.365941 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-zhnd5_73edb31c-ae80-4a89-96a4-496594406256/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 12:05:20 crc kubenswrapper[4909]: I1201 12:05:20.875043 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_d516e7e6-4b24-41b8-bde1-d533d1e77d61/memcached/0.log" Dec 01 12:05:36 crc kubenswrapper[4909]: I1201 12:05:36.193564 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:05:36 crc kubenswrapper[4909]: I1201 12:05:36.194489 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:05:36 crc kubenswrapper[4909]: I1201 12:05:36.194550 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" Dec 01 12:05:36 crc kubenswrapper[4909]: I1201 12:05:36.195681 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3"} pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Dec 01 12:05:36 crc kubenswrapper[4909]: I1201 12:05:36.195772 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" containerID="cri-o://1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" gracePeriod=600 Dec 01 12:05:36 crc kubenswrapper[4909]: E1201 12:05:36.387934 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:05:37 crc kubenswrapper[4909]: I1201 12:05:37.046730 4909 generic.go:334] "Generic (PLEG): container finished" podID="672850e4-d044-44cc-b8a2-517dc1a285be" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" exitCode=0 Dec 01 12:05:37 crc kubenswrapper[4909]: I1201 12:05:37.046784 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerDied","Data":"1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3"} Dec 01 12:05:37 crc kubenswrapper[4909]: I1201 12:05:37.046825 4909 scope.go:117] "RemoveContainer" containerID="2aaec19c8a09c357b1c58d724097fc2fb7674e854b41091b1ce6a1532c298063" Dec 01 12:05:37 crc kubenswrapper[4909]: I1201 12:05:37.047463 4909 scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:05:37 crc kubenswrapper[4909]: E1201 12:05:37.047753 4909 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:05:41 crc kubenswrapper[4909]: I1201 12:05:41.105561 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd_6e65ad27-ab58-4879-af77-a626651a5be9/util/0.log" Dec 01 12:05:41 crc kubenswrapper[4909]: I1201 12:05:41.359731 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd_6e65ad27-ab58-4879-af77-a626651a5be9/pull/0.log" Dec 01 12:05:41 crc kubenswrapper[4909]: I1201 12:05:41.371473 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd_6e65ad27-ab58-4879-af77-a626651a5be9/util/0.log" Dec 01 12:05:41 crc kubenswrapper[4909]: I1201 12:05:41.425025 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd_6e65ad27-ab58-4879-af77-a626651a5be9/pull/0.log" Dec 01 12:05:41 crc kubenswrapper[4909]: I1201 12:05:41.626437 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd_6e65ad27-ab58-4879-af77-a626651a5be9/pull/0.log" Dec 01 12:05:41 crc kubenswrapper[4909]: I1201 12:05:41.629210 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd_6e65ad27-ab58-4879-af77-a626651a5be9/util/0.log" Dec 01 12:05:41 crc kubenswrapper[4909]: I1201 12:05:41.684687 
4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_831225e757aae934615157119572f4b3f73ef4aa61b30083b422f45366xlkfd_6e65ad27-ab58-4879-af77-a626651a5be9/extract/0.log" Dec 01 12:05:41 crc kubenswrapper[4909]: I1201 12:05:41.855182 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-hdfcg_dc6b6fc4-e26d-4401-ba4f-f87c3a5b503c/kube-rbac-proxy/0.log" Dec 01 12:05:41 crc kubenswrapper[4909]: I1201 12:05:41.890138 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-hdfcg_dc6b6fc4-e26d-4401-ba4f-f87c3a5b503c/manager/0.log" Dec 01 12:05:41 crc kubenswrapper[4909]: I1201 12:05:41.955741 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-9txzx_cfac3bef-3c49-47ba-94be-d267a57f731a/kube-rbac-proxy/0.log" Dec 01 12:05:42 crc kubenswrapper[4909]: I1201 12:05:42.097208 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-9txzx_cfac3bef-3c49-47ba-94be-d267a57f731a/manager/0.log" Dec 01 12:05:42 crc kubenswrapper[4909]: I1201 12:05:42.190775 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-77s9g_19f1e6b3-cf7a-45b0-b9c2-8c48bdf6ead1/kube-rbac-proxy/0.log" Dec 01 12:05:42 crc kubenswrapper[4909]: I1201 12:05:42.197401 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-77s9g_19f1e6b3-cf7a-45b0-b9c2-8c48bdf6ead1/manager/0.log" Dec 01 12:05:42 crc kubenswrapper[4909]: I1201 12:05:42.376239 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8c7f494db-q9n6n_9cbe8fe9-4f7c-4ba7-b6c3-ff8a074fcc41/kube-rbac-proxy/0.log" Dec 
01 12:05:42 crc kubenswrapper[4909]: I1201 12:05:42.461811 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8c7f494db-q9n6n_9cbe8fe9-4f7c-4ba7-b6c3-ff8a074fcc41/manager/0.log" Dec 01 12:05:42 crc kubenswrapper[4909]: I1201 12:05:42.560477 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-z98b8_67a7fb10-11c3-4fb1-90c7-ea30b122719f/kube-rbac-proxy/0.log" Dec 01 12:05:42 crc kubenswrapper[4909]: I1201 12:05:42.590529 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-z98b8_67a7fb10-11c3-4fb1-90c7-ea30b122719f/manager/0.log" Dec 01 12:05:42 crc kubenswrapper[4909]: I1201 12:05:42.671985 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-wjqrn_1cedb704-1d3a-4c68-b14e-00bc65309891/kube-rbac-proxy/0.log" Dec 01 12:05:42 crc kubenswrapper[4909]: I1201 12:05:42.776613 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-wjqrn_1cedb704-1d3a-4c68-b14e-00bc65309891/manager/0.log" Dec 01 12:05:42 crc kubenswrapper[4909]: I1201 12:05:42.867006 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-wbqsz_246fed43-c776-4c8c-a6ac-1f4364191266/kube-rbac-proxy/0.log" Dec 01 12:05:43 crc kubenswrapper[4909]: I1201 12:05:43.021665 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-wbqsz_246fed43-c776-4c8c-a6ac-1f4364191266/manager/0.log" Dec 01 12:05:43 crc kubenswrapper[4909]: I1201 12:05:43.081317 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-nxbp4_949868bd-22ef-467f-b767-2d6dbd02dab1/kube-rbac-proxy/0.log" Dec 01 12:05:43 crc kubenswrapper[4909]: I1201 12:05:43.085593 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-nxbp4_949868bd-22ef-467f-b767-2d6dbd02dab1/manager/0.log" Dec 01 12:05:43 crc kubenswrapper[4909]: I1201 12:05:43.224816 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-q7zr4_79de64b2-dbd0-4877-9556-8fb199c9786c/kube-rbac-proxy/0.log" Dec 01 12:05:43 crc kubenswrapper[4909]: I1201 12:05:43.294038 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-q7zr4_79de64b2-dbd0-4877-9556-8fb199c9786c/manager/0.log" Dec 01 12:05:43 crc kubenswrapper[4909]: I1201 12:05:43.416283 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-twgwh_6d2a5c48-0cc8-4b2f-9c17-ee4d82e72e45/kube-rbac-proxy/0.log" Dec 01 12:05:43 crc kubenswrapper[4909]: I1201 12:05:43.440050 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-twgwh_6d2a5c48-0cc8-4b2f-9c17-ee4d82e72e45/manager/0.log" Dec 01 12:05:43 crc kubenswrapper[4909]: I1201 12:05:43.521857 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-2rz9d_510857ee-52e0-4d2c-8fff-2cc3dabb0dfa/kube-rbac-proxy/0.log" Dec 01 12:05:43 crc kubenswrapper[4909]: I1201 12:05:43.616762 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-2rz9d_510857ee-52e0-4d2c-8fff-2cc3dabb0dfa/manager/0.log" Dec 01 12:05:43 crc kubenswrapper[4909]: I1201 
12:05:43.710061 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-zbkjz_282ee9d9-b6ee-4dd7-bce4-419428bd744b/kube-rbac-proxy/0.log" Dec 01 12:05:43 crc kubenswrapper[4909]: I1201 12:05:43.779589 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-zbkjz_282ee9d9-b6ee-4dd7-bce4-419428bd744b/manager/0.log" Dec 01 12:05:43 crc kubenswrapper[4909]: I1201 12:05:43.927113 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-pzh2j_577bc508-a3f5-4b97-9360-12b1dfee3890/kube-rbac-proxy/0.log" Dec 01 12:05:44 crc kubenswrapper[4909]: I1201 12:05:44.001463 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-pzh2j_577bc508-a3f5-4b97-9360-12b1dfee3890/manager/0.log" Dec 01 12:05:44 crc kubenswrapper[4909]: I1201 12:05:44.161034 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-fh7wp_8a1199cd-df49-4078-b2ab-211f902bd097/kube-rbac-proxy/0.log" Dec 01 12:05:44 crc kubenswrapper[4909]: I1201 12:05:44.204104 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-fh7wp_8a1199cd-df49-4078-b2ab-211f902bd097/manager/0.log" Dec 01 12:05:44 crc kubenswrapper[4909]: I1201 12:05:44.297810 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l_c1265544-727e-4a41-a2e4-c612230cbbc0/kube-rbac-proxy/0.log" Dec 01 12:05:44 crc kubenswrapper[4909]: I1201 12:05:44.405180 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4z8l8l_c1265544-727e-4a41-a2e4-c612230cbbc0/manager/0.log" Dec 01 12:05:44 crc kubenswrapper[4909]: I1201 12:05:44.886148 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7f6d49fd8b-fs7ts_3ec99ef5-053c-4815-b711-aa6798c05e05/operator/0.log" Dec 01 12:05:44 crc kubenswrapper[4909]: I1201 12:05:44.889899 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-c8hpw_945864bb-ec20-4030-83cb-383340ff9107/registry-server/0.log" Dec 01 12:05:45 crc kubenswrapper[4909]: I1201 12:05:45.146772 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-v585g_51f475b0-ff8d-4438-8c91-c41ce79ea8d1/kube-rbac-proxy/0.log" Dec 01 12:05:45 crc kubenswrapper[4909]: I1201 12:05:45.244335 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-7qnmh_d527652d-7e68-4494-9e69-75296fb63932/kube-rbac-proxy/0.log" Dec 01 12:05:45 crc kubenswrapper[4909]: I1201 12:05:45.331614 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-v585g_51f475b0-ff8d-4438-8c91-c41ce79ea8d1/manager/0.log" Dec 01 12:05:45 crc kubenswrapper[4909]: I1201 12:05:45.362906 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-7qnmh_d527652d-7e68-4494-9e69-75296fb63932/manager/0.log" Dec 01 12:05:45 crc kubenswrapper[4909]: I1201 12:05:45.478334 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-nmswj_02f59677-4f80-4d7d-8921-08e112fadbc2/operator/0.log" Dec 01 12:05:45 crc kubenswrapper[4909]: I1201 12:05:45.761105 4909 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-wgsw2_63d2cbce-2ef0-43e7-83da-8350663cf0c0/kube-rbac-proxy/0.log" Dec 01 12:05:45 crc kubenswrapper[4909]: I1201 12:05:45.762963 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-wgsw2_63d2cbce-2ef0-43e7-83da-8350663cf0c0/manager/0.log" Dec 01 12:05:45 crc kubenswrapper[4909]: I1201 12:05:45.872357 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7884db5fb-nljsq_38909617-5f76-49e1-a3ad-0c3917fddb55/manager/0.log" Dec 01 12:05:46 crc kubenswrapper[4909]: I1201 12:05:46.005678 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-dq9jz_3dcfb91f-b0e5-4087-99cb-09cec4cd5f72/kube-rbac-proxy/0.log" Dec 01 12:05:46 crc kubenswrapper[4909]: I1201 12:05:46.052481 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-dq9jz_3dcfb91f-b0e5-4087-99cb-09cec4cd5f72/manager/0.log" Dec 01 12:05:46 crc kubenswrapper[4909]: I1201 12:05:46.147576 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-pg7qg_8e79701a-e50d-4782-aca7-82ce16bbfa7e/kube-rbac-proxy/0.log" Dec 01 12:05:46 crc kubenswrapper[4909]: I1201 12:05:46.556483 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-xbhwr_b78befb3-d754-4c59-b0fd-d5dcb9e06588/kube-rbac-proxy/0.log" Dec 01 12:05:46 crc kubenswrapper[4909]: I1201 12:05:46.575652 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-pg7qg_8e79701a-e50d-4782-aca7-82ce16bbfa7e/manager/0.log" Dec 01 12:05:46 crc 
kubenswrapper[4909]: I1201 12:05:46.622085 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-xbhwr_b78befb3-d754-4c59-b0fd-d5dcb9e06588/manager/0.log" Dec 01 12:05:50 crc kubenswrapper[4909]: I1201 12:05:50.257676 4909 scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:05:50 crc kubenswrapper[4909]: E1201 12:05:50.258529 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:06:04 crc kubenswrapper[4909]: I1201 12:06:04.257554 4909 scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:06:04 crc kubenswrapper[4909]: E1201 12:06:04.258450 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:06:06 crc kubenswrapper[4909]: I1201 12:06:06.750161 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-gtxhz_9e922379-f723-4440-a90e-182b0917c969/control-plane-machine-set-operator/0.log" Dec 01 12:06:06 crc kubenswrapper[4909]: I1201 12:06:06.975251 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5wkg2_6744f6c5-4a6c-4188-b4c4-0a25aac51b1d/kube-rbac-proxy/0.log" Dec 01 12:06:07 crc kubenswrapper[4909]: I1201 12:06:07.011807 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5wkg2_6744f6c5-4a6c-4188-b4c4-0a25aac51b1d/machine-api-operator/0.log" Dec 01 12:06:17 crc kubenswrapper[4909]: I1201 12:06:17.258967 4909 scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:06:17 crc kubenswrapper[4909]: E1201 12:06:17.260676 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:06:21 crc kubenswrapper[4909]: I1201 12:06:21.503919 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-9hgvq_223fa06f-e48d-419c-848f-02792e3f9a17/cert-manager-controller/0.log" Dec 01 12:06:21 crc kubenswrapper[4909]: I1201 12:06:21.712713 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-vkfwh_da64b9e1-fde2-45af-92aa-2a376d5afbcf/cert-manager-cainjector/0.log" Dec 01 12:06:21 crc kubenswrapper[4909]: I1201 12:06:21.763741 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-pfndl_ff0653be-4d31-4748-9729-1114a630b8fd/cert-manager-webhook/0.log" Dec 01 12:06:32 crc kubenswrapper[4909]: I1201 12:06:32.257347 4909 scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:06:32 crc 
kubenswrapper[4909]: E1201 12:06:32.258193 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:06:35 crc kubenswrapper[4909]: I1201 12:06:35.209248 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-kbcdt_c33158af-8ace-44c3-bb11-587dc452768e/nmstate-console-plugin/0.log" Dec 01 12:06:35 crc kubenswrapper[4909]: I1201 12:06:35.502503 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-rg5bv_3984210c-0ef8-41fb-b086-fc47d00b1101/nmstate-handler/0.log" Dec 01 12:06:35 crc kubenswrapper[4909]: I1201 12:06:35.526314 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-s6vqh_d40ebf8e-cd42-4946-ad61-06e5d3e254da/kube-rbac-proxy/0.log" Dec 01 12:06:35 crc kubenswrapper[4909]: I1201 12:06:35.557696 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-s6vqh_d40ebf8e-cd42-4946-ad61-06e5d3e254da/nmstate-metrics/0.log" Dec 01 12:06:35 crc kubenswrapper[4909]: I1201 12:06:35.801975 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-78fh2_dd3296b5-d517-4d8e-8572-107b49b7d78b/nmstate-webhook/0.log" Dec 01 12:06:35 crc kubenswrapper[4909]: I1201 12:06:35.803692 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-7dgvd_3e41ca35-79dd-414a-bfb0-d98fc06ea9ab/nmstate-operator/0.log" Dec 01 12:06:47 crc kubenswrapper[4909]: I1201 12:06:47.257631 4909 scope.go:117] 
"RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:06:47 crc kubenswrapper[4909]: E1201 12:06:47.258385 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:06:51 crc kubenswrapper[4909]: I1201 12:06:51.028767 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-zvqbb_c3fe137b-0797-46b2-b512-cf8fa383f42d/kube-rbac-proxy/0.log" Dec 01 12:06:51 crc kubenswrapper[4909]: I1201 12:06:51.120766 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-zvqbb_c3fe137b-0797-46b2-b512-cf8fa383f42d/controller/0.log" Dec 01 12:06:51 crc kubenswrapper[4909]: I1201 12:06:51.287769 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-22tnq_4e84529d-1222-4db1-9488-1ed872b096af/frr-k8s-webhook-server/0.log" Dec 01 12:06:51 crc kubenswrapper[4909]: I1201 12:06:51.323225 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xdgcz_5419a1e9-7227-4f9e-89b2-ecb8469c1cd6/cp-frr-files/0.log" Dec 01 12:06:51 crc kubenswrapper[4909]: I1201 12:06:51.508718 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xdgcz_5419a1e9-7227-4f9e-89b2-ecb8469c1cd6/cp-frr-files/0.log" Dec 01 12:06:51 crc kubenswrapper[4909]: I1201 12:06:51.523645 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xdgcz_5419a1e9-7227-4f9e-89b2-ecb8469c1cd6/cp-reloader/0.log" Dec 01 12:06:51 crc kubenswrapper[4909]: I1201 12:06:51.528705 4909 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xdgcz_5419a1e9-7227-4f9e-89b2-ecb8469c1cd6/cp-reloader/0.log" Dec 01 12:06:51 crc kubenswrapper[4909]: I1201 12:06:51.534737 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xdgcz_5419a1e9-7227-4f9e-89b2-ecb8469c1cd6/cp-metrics/0.log" Dec 01 12:06:51 crc kubenswrapper[4909]: I1201 12:06:51.752024 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xdgcz_5419a1e9-7227-4f9e-89b2-ecb8469c1cd6/cp-metrics/0.log" Dec 01 12:06:51 crc kubenswrapper[4909]: I1201 12:06:51.773033 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xdgcz_5419a1e9-7227-4f9e-89b2-ecb8469c1cd6/cp-reloader/0.log" Dec 01 12:06:51 crc kubenswrapper[4909]: I1201 12:06:51.793259 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xdgcz_5419a1e9-7227-4f9e-89b2-ecb8469c1cd6/cp-frr-files/0.log" Dec 01 12:06:51 crc kubenswrapper[4909]: I1201 12:06:51.809584 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xdgcz_5419a1e9-7227-4f9e-89b2-ecb8469c1cd6/cp-metrics/0.log" Dec 01 12:06:51 crc kubenswrapper[4909]: I1201 12:06:51.971692 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xdgcz_5419a1e9-7227-4f9e-89b2-ecb8469c1cd6/cp-frr-files/0.log" Dec 01 12:06:52 crc kubenswrapper[4909]: I1201 12:06:52.001107 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xdgcz_5419a1e9-7227-4f9e-89b2-ecb8469c1cd6/cp-reloader/0.log" Dec 01 12:06:52 crc kubenswrapper[4909]: I1201 12:06:52.042241 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xdgcz_5419a1e9-7227-4f9e-89b2-ecb8469c1cd6/controller/0.log" Dec 01 12:06:52 crc kubenswrapper[4909]: I1201 12:06:52.049672 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-xdgcz_5419a1e9-7227-4f9e-89b2-ecb8469c1cd6/cp-metrics/0.log" Dec 01 12:06:52 crc kubenswrapper[4909]: I1201 12:06:52.250476 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xdgcz_5419a1e9-7227-4f9e-89b2-ecb8469c1cd6/frr-metrics/0.log" Dec 01 12:06:52 crc kubenswrapper[4909]: I1201 12:06:52.295339 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xdgcz_5419a1e9-7227-4f9e-89b2-ecb8469c1cd6/kube-rbac-proxy/0.log" Dec 01 12:06:52 crc kubenswrapper[4909]: I1201 12:06:52.378206 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xdgcz_5419a1e9-7227-4f9e-89b2-ecb8469c1cd6/kube-rbac-proxy-frr/0.log" Dec 01 12:06:52 crc kubenswrapper[4909]: I1201 12:06:52.521625 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xdgcz_5419a1e9-7227-4f9e-89b2-ecb8469c1cd6/reloader/0.log" Dec 01 12:06:52 crc kubenswrapper[4909]: I1201 12:06:52.635622 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-dd49f7fb7-z5zjd_2dd20de1-a28e-4db1-9776-e206ca717a54/manager/0.log" Dec 01 12:06:52 crc kubenswrapper[4909]: I1201 12:06:52.751564 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-9cd4d6db-x89s8_40db0db9-27bb-4dfe-aff9-a9a8dcd67142/webhook-server/0.log" Dec 01 12:06:53 crc kubenswrapper[4909]: I1201 12:06:53.034006 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lz22m_359939eb-1948-4c14-b573-16e8342ce29a/kube-rbac-proxy/0.log" Dec 01 12:06:53 crc kubenswrapper[4909]: I1201 12:06:53.643108 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lz22m_359939eb-1948-4c14-b573-16e8342ce29a/speaker/0.log" Dec 01 12:06:53 crc kubenswrapper[4909]: I1201 12:06:53.838973 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-xdgcz_5419a1e9-7227-4f9e-89b2-ecb8469c1cd6/frr/0.log" Dec 01 12:07:01 crc kubenswrapper[4909]: I1201 12:07:01.258612 4909 scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:07:01 crc kubenswrapper[4909]: E1201 12:07:01.259982 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:07:06 crc kubenswrapper[4909]: I1201 12:07:06.341500 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz_1168e695-833f-4350-9091-31cf52abecb7/util/0.log" Dec 01 12:07:06 crc kubenswrapper[4909]: I1201 12:07:06.669030 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz_1168e695-833f-4350-9091-31cf52abecb7/util/0.log" Dec 01 12:07:06 crc kubenswrapper[4909]: I1201 12:07:06.691403 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz_1168e695-833f-4350-9091-31cf52abecb7/pull/0.log" Dec 01 12:07:06 crc kubenswrapper[4909]: I1201 12:07:06.738981 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz_1168e695-833f-4350-9091-31cf52abecb7/pull/0.log" Dec 01 12:07:06 crc kubenswrapper[4909]: I1201 12:07:06.883177 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz_1168e695-833f-4350-9091-31cf52abecb7/util/0.log" Dec 01 12:07:06 crc kubenswrapper[4909]: I1201 12:07:06.892578 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz_1168e695-833f-4350-9091-31cf52abecb7/pull/0.log" Dec 01 12:07:06 crc kubenswrapper[4909]: I1201 12:07:06.926932 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fx4xwz_1168e695-833f-4350-9091-31cf52abecb7/extract/0.log" Dec 01 12:07:07 crc kubenswrapper[4909]: I1201 12:07:07.102972 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7_a1beb94c-1e21-4dd7-814e-c345e45dc803/util/0.log" Dec 01 12:07:07 crc kubenswrapper[4909]: I1201 12:07:07.290744 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7_a1beb94c-1e21-4dd7-814e-c345e45dc803/pull/0.log" Dec 01 12:07:07 crc kubenswrapper[4909]: I1201 12:07:07.317795 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7_a1beb94c-1e21-4dd7-814e-c345e45dc803/util/0.log" Dec 01 12:07:07 crc kubenswrapper[4909]: I1201 12:07:07.323762 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7_a1beb94c-1e21-4dd7-814e-c345e45dc803/pull/0.log" Dec 01 12:07:07 crc kubenswrapper[4909]: I1201 12:07:07.450658 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7_a1beb94c-1e21-4dd7-814e-c345e45dc803/util/0.log" Dec 01 
12:07:07 crc kubenswrapper[4909]: I1201 12:07:07.476264 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7_a1beb94c-1e21-4dd7-814e-c345e45dc803/pull/0.log" Dec 01 12:07:07 crc kubenswrapper[4909]: I1201 12:07:07.529843 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rmrl7_a1beb94c-1e21-4dd7-814e-c345e45dc803/extract/0.log" Dec 01 12:07:07 crc kubenswrapper[4909]: I1201 12:07:07.650927 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ffblz_2d64a6cb-82ae-49da-9518-72b6727de254/extract-utilities/0.log" Dec 01 12:07:07 crc kubenswrapper[4909]: I1201 12:07:07.830115 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ffblz_2d64a6cb-82ae-49da-9518-72b6727de254/extract-content/0.log" Dec 01 12:07:07 crc kubenswrapper[4909]: I1201 12:07:07.855507 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ffblz_2d64a6cb-82ae-49da-9518-72b6727de254/extract-content/0.log" Dec 01 12:07:07 crc kubenswrapper[4909]: I1201 12:07:07.863097 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ffblz_2d64a6cb-82ae-49da-9518-72b6727de254/extract-utilities/0.log" Dec 01 12:07:07 crc kubenswrapper[4909]: I1201 12:07:07.975927 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ffblz_2d64a6cb-82ae-49da-9518-72b6727de254/extract-utilities/0.log" Dec 01 12:07:08 crc kubenswrapper[4909]: I1201 12:07:08.031096 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ffblz_2d64a6cb-82ae-49da-9518-72b6727de254/extract-content/0.log" Dec 01 12:07:08 crc kubenswrapper[4909]: I1201 12:07:08.243345 
4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8z7nq_c18487f9-9d29-4164-9652-052bc763a829/extract-utilities/0.log" Dec 01 12:07:08 crc kubenswrapper[4909]: I1201 12:07:08.411123 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8z7nq_c18487f9-9d29-4164-9652-052bc763a829/extract-utilities/0.log" Dec 01 12:07:08 crc kubenswrapper[4909]: I1201 12:07:08.487999 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8z7nq_c18487f9-9d29-4164-9652-052bc763a829/extract-content/0.log" Dec 01 12:07:08 crc kubenswrapper[4909]: I1201 12:07:08.497271 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8z7nq_c18487f9-9d29-4164-9652-052bc763a829/extract-content/0.log" Dec 01 12:07:08 crc kubenswrapper[4909]: I1201 12:07:08.760999 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8z7nq_c18487f9-9d29-4164-9652-052bc763a829/extract-content/0.log" Dec 01 12:07:08 crc kubenswrapper[4909]: I1201 12:07:08.775544 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8z7nq_c18487f9-9d29-4164-9652-052bc763a829/extract-utilities/0.log" Dec 01 12:07:08 crc kubenswrapper[4909]: I1201 12:07:08.792126 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ffblz_2d64a6cb-82ae-49da-9518-72b6727de254/registry-server/0.log" Dec 01 12:07:09 crc kubenswrapper[4909]: I1201 12:07:09.090089 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8z7nq_c18487f9-9d29-4164-9652-052bc763a829/registry-server/0.log" Dec 01 12:07:09 crc kubenswrapper[4909]: I1201 12:07:09.151851 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-hqgll_6e745987-2227-4479-9ea8-3bf3ce5ba444/marketplace-operator/0.log" Dec 01 12:07:09 crc kubenswrapper[4909]: I1201 12:07:09.385992 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cbcqx_dd102738-71ee-4009-8101-f645325f2de7/extract-utilities/0.log" Dec 01 12:07:09 crc kubenswrapper[4909]: I1201 12:07:09.563681 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cbcqx_dd102738-71ee-4009-8101-f645325f2de7/extract-utilities/0.log" Dec 01 12:07:09 crc kubenswrapper[4909]: I1201 12:07:09.572344 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cbcqx_dd102738-71ee-4009-8101-f645325f2de7/extract-content/0.log" Dec 01 12:07:09 crc kubenswrapper[4909]: I1201 12:07:09.595436 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cbcqx_dd102738-71ee-4009-8101-f645325f2de7/extract-content/0.log" Dec 01 12:07:09 crc kubenswrapper[4909]: I1201 12:07:09.790978 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cbcqx_dd102738-71ee-4009-8101-f645325f2de7/extract-content/0.log" Dec 01 12:07:09 crc kubenswrapper[4909]: I1201 12:07:09.798609 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cbcqx_dd102738-71ee-4009-8101-f645325f2de7/extract-utilities/0.log" Dec 01 12:07:09 crc kubenswrapper[4909]: I1201 12:07:09.986828 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-cbcqx_dd102738-71ee-4009-8101-f645325f2de7/registry-server/0.log" Dec 01 12:07:10 crc kubenswrapper[4909]: I1201 12:07:10.096494 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-8p4vb_88e81d71-dfc9-44cf-b226-133d99220628/extract-utilities/0.log" Dec 01 12:07:10 crc kubenswrapper[4909]: I1201 12:07:10.230964 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p4vb_88e81d71-dfc9-44cf-b226-133d99220628/extract-utilities/0.log" Dec 01 12:07:10 crc kubenswrapper[4909]: I1201 12:07:10.238220 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p4vb_88e81d71-dfc9-44cf-b226-133d99220628/extract-content/0.log" Dec 01 12:07:10 crc kubenswrapper[4909]: I1201 12:07:10.257242 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p4vb_88e81d71-dfc9-44cf-b226-133d99220628/extract-content/0.log" Dec 01 12:07:10 crc kubenswrapper[4909]: I1201 12:07:10.409187 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p4vb_88e81d71-dfc9-44cf-b226-133d99220628/extract-utilities/0.log" Dec 01 12:07:10 crc kubenswrapper[4909]: I1201 12:07:10.462305 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p4vb_88e81d71-dfc9-44cf-b226-133d99220628/extract-content/0.log" Dec 01 12:07:11 crc kubenswrapper[4909]: I1201 12:07:11.020957 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p4vb_88e81d71-dfc9-44cf-b226-133d99220628/registry-server/0.log" Dec 01 12:07:14 crc kubenswrapper[4909]: I1201 12:07:14.257548 4909 scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:07:14 crc kubenswrapper[4909]: E1201 12:07:14.259596 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:07:26 crc kubenswrapper[4909]: I1201 12:07:26.258092 4909 scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:07:26 crc kubenswrapper[4909]: E1201 12:07:26.258794 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:07:41 crc kubenswrapper[4909]: I1201 12:07:41.257466 4909 scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:07:41 crc kubenswrapper[4909]: E1201 12:07:41.258715 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:07:42 crc kubenswrapper[4909]: E1201 12:07:42.511565 4909 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.72:35598->38.129.56.72:33131: write tcp 38.129.56.72:35598->38.129.56.72:33131: write: connection reset by peer Dec 01 12:07:56 crc kubenswrapper[4909]: I1201 12:07:56.258278 4909 scope.go:117] "RemoveContainer" 
containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:07:56 crc kubenswrapper[4909]: E1201 12:07:56.259203 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:08:03 crc kubenswrapper[4909]: I1201 12:08:03.080158 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wcdnv"] Dec 01 12:08:03 crc kubenswrapper[4909]: E1201 12:08:03.081600 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f321034-fbdd-4d53-b335-46d7f8976f2f" containerName="container-00" Dec 01 12:08:03 crc kubenswrapper[4909]: I1201 12:08:03.081621 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f321034-fbdd-4d53-b335-46d7f8976f2f" containerName="container-00" Dec 01 12:08:03 crc kubenswrapper[4909]: I1201 12:08:03.081839 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f321034-fbdd-4d53-b335-46d7f8976f2f" containerName="container-00" Dec 01 12:08:03 crc kubenswrapper[4909]: I1201 12:08:03.083665 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wcdnv" Dec 01 12:08:03 crc kubenswrapper[4909]: I1201 12:08:03.096651 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wcdnv"] Dec 01 12:08:03 crc kubenswrapper[4909]: I1201 12:08:03.148912 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssrcd\" (UniqueName: \"kubernetes.io/projected/1c3e7a58-aec3-403c-8c6a-5cbc9df94398-kube-api-access-ssrcd\") pod \"community-operators-wcdnv\" (UID: \"1c3e7a58-aec3-403c-8c6a-5cbc9df94398\") " pod="openshift-marketplace/community-operators-wcdnv" Dec 01 12:08:03 crc kubenswrapper[4909]: I1201 12:08:03.149230 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c3e7a58-aec3-403c-8c6a-5cbc9df94398-catalog-content\") pod \"community-operators-wcdnv\" (UID: \"1c3e7a58-aec3-403c-8c6a-5cbc9df94398\") " pod="openshift-marketplace/community-operators-wcdnv" Dec 01 12:08:03 crc kubenswrapper[4909]: I1201 12:08:03.149329 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c3e7a58-aec3-403c-8c6a-5cbc9df94398-utilities\") pod \"community-operators-wcdnv\" (UID: \"1c3e7a58-aec3-403c-8c6a-5cbc9df94398\") " pod="openshift-marketplace/community-operators-wcdnv" Dec 01 12:08:03 crc kubenswrapper[4909]: I1201 12:08:03.253239 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssrcd\" (UniqueName: \"kubernetes.io/projected/1c3e7a58-aec3-403c-8c6a-5cbc9df94398-kube-api-access-ssrcd\") pod \"community-operators-wcdnv\" (UID: \"1c3e7a58-aec3-403c-8c6a-5cbc9df94398\") " pod="openshift-marketplace/community-operators-wcdnv" Dec 01 12:08:03 crc kubenswrapper[4909]: I1201 12:08:03.253338 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c3e7a58-aec3-403c-8c6a-5cbc9df94398-catalog-content\") pod \"community-operators-wcdnv\" (UID: \"1c3e7a58-aec3-403c-8c6a-5cbc9df94398\") " pod="openshift-marketplace/community-operators-wcdnv" Dec 01 12:08:03 crc kubenswrapper[4909]: I1201 12:08:03.253368 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c3e7a58-aec3-403c-8c6a-5cbc9df94398-utilities\") pod \"community-operators-wcdnv\" (UID: \"1c3e7a58-aec3-403c-8c6a-5cbc9df94398\") " pod="openshift-marketplace/community-operators-wcdnv" Dec 01 12:08:03 crc kubenswrapper[4909]: I1201 12:08:03.253947 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c3e7a58-aec3-403c-8c6a-5cbc9df94398-utilities\") pod \"community-operators-wcdnv\" (UID: \"1c3e7a58-aec3-403c-8c6a-5cbc9df94398\") " pod="openshift-marketplace/community-operators-wcdnv" Dec 01 12:08:03 crc kubenswrapper[4909]: I1201 12:08:03.254050 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c3e7a58-aec3-403c-8c6a-5cbc9df94398-catalog-content\") pod \"community-operators-wcdnv\" (UID: \"1c3e7a58-aec3-403c-8c6a-5cbc9df94398\") " pod="openshift-marketplace/community-operators-wcdnv" Dec 01 12:08:03 crc kubenswrapper[4909]: I1201 12:08:03.273990 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssrcd\" (UniqueName: \"kubernetes.io/projected/1c3e7a58-aec3-403c-8c6a-5cbc9df94398-kube-api-access-ssrcd\") pod \"community-operators-wcdnv\" (UID: \"1c3e7a58-aec3-403c-8c6a-5cbc9df94398\") " pod="openshift-marketplace/community-operators-wcdnv" Dec 01 12:08:03 crc kubenswrapper[4909]: I1201 12:08:03.417242 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wcdnv" Dec 01 12:08:03 crc kubenswrapper[4909]: I1201 12:08:03.966995 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wcdnv"] Dec 01 12:08:04 crc kubenswrapper[4909]: I1201 12:08:04.401580 4909 generic.go:334] "Generic (PLEG): container finished" podID="1c3e7a58-aec3-403c-8c6a-5cbc9df94398" containerID="8bf75bf29c5bd99ff2ff99e1ee32fcd60ee996561c3ea84c9bd8309037e5e05d" exitCode=0 Dec 01 12:08:04 crc kubenswrapper[4909]: I1201 12:08:04.401634 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcdnv" event={"ID":"1c3e7a58-aec3-403c-8c6a-5cbc9df94398","Type":"ContainerDied","Data":"8bf75bf29c5bd99ff2ff99e1ee32fcd60ee996561c3ea84c9bd8309037e5e05d"} Dec 01 12:08:04 crc kubenswrapper[4909]: I1201 12:08:04.401667 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcdnv" event={"ID":"1c3e7a58-aec3-403c-8c6a-5cbc9df94398","Type":"ContainerStarted","Data":"3d29c1db4225b0085e58d838eb17d875744296e090dba354f7ab2510fb3904cf"} Dec 01 12:08:05 crc kubenswrapper[4909]: I1201 12:08:05.411844 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcdnv" event={"ID":"1c3e7a58-aec3-403c-8c6a-5cbc9df94398","Type":"ContainerStarted","Data":"c5099a24c1900a9b882dad11d348955008255d32f68c58fe73ac1a95aa1f6789"} Dec 01 12:08:06 crc kubenswrapper[4909]: I1201 12:08:06.421992 4909 generic.go:334] "Generic (PLEG): container finished" podID="1c3e7a58-aec3-403c-8c6a-5cbc9df94398" containerID="c5099a24c1900a9b882dad11d348955008255d32f68c58fe73ac1a95aa1f6789" exitCode=0 Dec 01 12:08:06 crc kubenswrapper[4909]: I1201 12:08:06.422065 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcdnv" 
event={"ID":"1c3e7a58-aec3-403c-8c6a-5cbc9df94398","Type":"ContainerDied","Data":"c5099a24c1900a9b882dad11d348955008255d32f68c58fe73ac1a95aa1f6789"} Dec 01 12:08:07 crc kubenswrapper[4909]: I1201 12:08:07.433562 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcdnv" event={"ID":"1c3e7a58-aec3-403c-8c6a-5cbc9df94398","Type":"ContainerStarted","Data":"91e424d314ca45700125127fd1090327540b32e608358d4c958f0d8e33ebe695"} Dec 01 12:08:07 crc kubenswrapper[4909]: I1201 12:08:07.464339 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wcdnv" podStartSLOduration=1.96038289 podStartE2EDuration="4.464319994s" podCreationTimestamp="2025-12-01 12:08:03 +0000 UTC" firstStartedPulling="2025-12-01 12:08:04.403623173 +0000 UTC m=+5801.638094071" lastFinishedPulling="2025-12-01 12:08:06.907560277 +0000 UTC m=+5804.142031175" observedRunningTime="2025-12-01 12:08:07.454749554 +0000 UTC m=+5804.689220472" watchObservedRunningTime="2025-12-01 12:08:07.464319994 +0000 UTC m=+5804.698790892" Dec 01 12:08:08 crc kubenswrapper[4909]: I1201 12:08:08.257730 4909 scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:08:08 crc kubenswrapper[4909]: E1201 12:08:08.258234 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:08:13 crc kubenswrapper[4909]: I1201 12:08:13.418121 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wcdnv" Dec 01 12:08:13 crc 
kubenswrapper[4909]: I1201 12:08:13.419733 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wcdnv" Dec 01 12:08:13 crc kubenswrapper[4909]: I1201 12:08:13.464994 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wcdnv" Dec 01 12:08:13 crc kubenswrapper[4909]: I1201 12:08:13.543545 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wcdnv" Dec 01 12:08:13 crc kubenswrapper[4909]: I1201 12:08:13.707974 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wcdnv"] Dec 01 12:08:15 crc kubenswrapper[4909]: I1201 12:08:15.496778 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wcdnv" podUID="1c3e7a58-aec3-403c-8c6a-5cbc9df94398" containerName="registry-server" containerID="cri-o://91e424d314ca45700125127fd1090327540b32e608358d4c958f0d8e33ebe695" gracePeriod=2 Dec 01 12:08:15 crc kubenswrapper[4909]: I1201 12:08:15.984502 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wcdnv" Dec 01 12:08:16 crc kubenswrapper[4909]: I1201 12:08:16.108843 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c3e7a58-aec3-403c-8c6a-5cbc9df94398-catalog-content\") pod \"1c3e7a58-aec3-403c-8c6a-5cbc9df94398\" (UID: \"1c3e7a58-aec3-403c-8c6a-5cbc9df94398\") " Dec 01 12:08:16 crc kubenswrapper[4909]: I1201 12:08:16.109077 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssrcd\" (UniqueName: \"kubernetes.io/projected/1c3e7a58-aec3-403c-8c6a-5cbc9df94398-kube-api-access-ssrcd\") pod \"1c3e7a58-aec3-403c-8c6a-5cbc9df94398\" (UID: \"1c3e7a58-aec3-403c-8c6a-5cbc9df94398\") " Dec 01 12:08:16 crc kubenswrapper[4909]: I1201 12:08:16.109204 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c3e7a58-aec3-403c-8c6a-5cbc9df94398-utilities\") pod \"1c3e7a58-aec3-403c-8c6a-5cbc9df94398\" (UID: \"1c3e7a58-aec3-403c-8c6a-5cbc9df94398\") " Dec 01 12:08:16 crc kubenswrapper[4909]: I1201 12:08:16.109893 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c3e7a58-aec3-403c-8c6a-5cbc9df94398-utilities" (OuterVolumeSpecName: "utilities") pod "1c3e7a58-aec3-403c-8c6a-5cbc9df94398" (UID: "1c3e7a58-aec3-403c-8c6a-5cbc9df94398"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:08:16 crc kubenswrapper[4909]: I1201 12:08:16.114152 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c3e7a58-aec3-403c-8c6a-5cbc9df94398-kube-api-access-ssrcd" (OuterVolumeSpecName: "kube-api-access-ssrcd") pod "1c3e7a58-aec3-403c-8c6a-5cbc9df94398" (UID: "1c3e7a58-aec3-403c-8c6a-5cbc9df94398"). InnerVolumeSpecName "kube-api-access-ssrcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:08:16 crc kubenswrapper[4909]: I1201 12:08:16.211071 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c3e7a58-aec3-403c-8c6a-5cbc9df94398-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:08:16 crc kubenswrapper[4909]: I1201 12:08:16.211133 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssrcd\" (UniqueName: \"kubernetes.io/projected/1c3e7a58-aec3-403c-8c6a-5cbc9df94398-kube-api-access-ssrcd\") on node \"crc\" DevicePath \"\"" Dec 01 12:08:16 crc kubenswrapper[4909]: I1201 12:08:16.505659 4909 generic.go:334] "Generic (PLEG): container finished" podID="1c3e7a58-aec3-403c-8c6a-5cbc9df94398" containerID="91e424d314ca45700125127fd1090327540b32e608358d4c958f0d8e33ebe695" exitCode=0 Dec 01 12:08:16 crc kubenswrapper[4909]: I1201 12:08:16.505724 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wcdnv" Dec 01 12:08:16 crc kubenswrapper[4909]: I1201 12:08:16.505728 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcdnv" event={"ID":"1c3e7a58-aec3-403c-8c6a-5cbc9df94398","Type":"ContainerDied","Data":"91e424d314ca45700125127fd1090327540b32e608358d4c958f0d8e33ebe695"} Dec 01 12:08:16 crc kubenswrapper[4909]: I1201 12:08:16.505807 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcdnv" event={"ID":"1c3e7a58-aec3-403c-8c6a-5cbc9df94398","Type":"ContainerDied","Data":"3d29c1db4225b0085e58d838eb17d875744296e090dba354f7ab2510fb3904cf"} Dec 01 12:08:16 crc kubenswrapper[4909]: I1201 12:08:16.505827 4909 scope.go:117] "RemoveContainer" containerID="91e424d314ca45700125127fd1090327540b32e608358d4c958f0d8e33ebe695" Dec 01 12:08:16 crc kubenswrapper[4909]: I1201 12:08:16.525727 4909 scope.go:117] "RemoveContainer" 
containerID="c5099a24c1900a9b882dad11d348955008255d32f68c58fe73ac1a95aa1f6789" Dec 01 12:08:16 crc kubenswrapper[4909]: I1201 12:08:16.556825 4909 scope.go:117] "RemoveContainer" containerID="8bf75bf29c5bd99ff2ff99e1ee32fcd60ee996561c3ea84c9bd8309037e5e05d" Dec 01 12:08:16 crc kubenswrapper[4909]: I1201 12:08:16.593179 4909 scope.go:117] "RemoveContainer" containerID="91e424d314ca45700125127fd1090327540b32e608358d4c958f0d8e33ebe695" Dec 01 12:08:16 crc kubenswrapper[4909]: E1201 12:08:16.593600 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91e424d314ca45700125127fd1090327540b32e608358d4c958f0d8e33ebe695\": container with ID starting with 91e424d314ca45700125127fd1090327540b32e608358d4c958f0d8e33ebe695 not found: ID does not exist" containerID="91e424d314ca45700125127fd1090327540b32e608358d4c958f0d8e33ebe695" Dec 01 12:08:16 crc kubenswrapper[4909]: I1201 12:08:16.593629 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91e424d314ca45700125127fd1090327540b32e608358d4c958f0d8e33ebe695"} err="failed to get container status \"91e424d314ca45700125127fd1090327540b32e608358d4c958f0d8e33ebe695\": rpc error: code = NotFound desc = could not find container \"91e424d314ca45700125127fd1090327540b32e608358d4c958f0d8e33ebe695\": container with ID starting with 91e424d314ca45700125127fd1090327540b32e608358d4c958f0d8e33ebe695 not found: ID does not exist" Dec 01 12:08:16 crc kubenswrapper[4909]: I1201 12:08:16.593653 4909 scope.go:117] "RemoveContainer" containerID="c5099a24c1900a9b882dad11d348955008255d32f68c58fe73ac1a95aa1f6789" Dec 01 12:08:16 crc kubenswrapper[4909]: E1201 12:08:16.593975 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5099a24c1900a9b882dad11d348955008255d32f68c58fe73ac1a95aa1f6789\": container with ID starting with 
c5099a24c1900a9b882dad11d348955008255d32f68c58fe73ac1a95aa1f6789 not found: ID does not exist" containerID="c5099a24c1900a9b882dad11d348955008255d32f68c58fe73ac1a95aa1f6789" Dec 01 12:08:16 crc kubenswrapper[4909]: I1201 12:08:16.594018 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5099a24c1900a9b882dad11d348955008255d32f68c58fe73ac1a95aa1f6789"} err="failed to get container status \"c5099a24c1900a9b882dad11d348955008255d32f68c58fe73ac1a95aa1f6789\": rpc error: code = NotFound desc = could not find container \"c5099a24c1900a9b882dad11d348955008255d32f68c58fe73ac1a95aa1f6789\": container with ID starting with c5099a24c1900a9b882dad11d348955008255d32f68c58fe73ac1a95aa1f6789 not found: ID does not exist" Dec 01 12:08:16 crc kubenswrapper[4909]: I1201 12:08:16.594044 4909 scope.go:117] "RemoveContainer" containerID="8bf75bf29c5bd99ff2ff99e1ee32fcd60ee996561c3ea84c9bd8309037e5e05d" Dec 01 12:08:16 crc kubenswrapper[4909]: E1201 12:08:16.594361 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bf75bf29c5bd99ff2ff99e1ee32fcd60ee996561c3ea84c9bd8309037e5e05d\": container with ID starting with 8bf75bf29c5bd99ff2ff99e1ee32fcd60ee996561c3ea84c9bd8309037e5e05d not found: ID does not exist" containerID="8bf75bf29c5bd99ff2ff99e1ee32fcd60ee996561c3ea84c9bd8309037e5e05d" Dec 01 12:08:16 crc kubenswrapper[4909]: I1201 12:08:16.594383 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bf75bf29c5bd99ff2ff99e1ee32fcd60ee996561c3ea84c9bd8309037e5e05d"} err="failed to get container status \"8bf75bf29c5bd99ff2ff99e1ee32fcd60ee996561c3ea84c9bd8309037e5e05d\": rpc error: code = NotFound desc = could not find container \"8bf75bf29c5bd99ff2ff99e1ee32fcd60ee996561c3ea84c9bd8309037e5e05d\": container with ID starting with 8bf75bf29c5bd99ff2ff99e1ee32fcd60ee996561c3ea84c9bd8309037e5e05d not found: ID does not 
exist" Dec 01 12:08:17 crc kubenswrapper[4909]: I1201 12:08:17.133738 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c3e7a58-aec3-403c-8c6a-5cbc9df94398-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c3e7a58-aec3-403c-8c6a-5cbc9df94398" (UID: "1c3e7a58-aec3-403c-8c6a-5cbc9df94398"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:08:17 crc kubenswrapper[4909]: I1201 12:08:17.231719 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c3e7a58-aec3-403c-8c6a-5cbc9df94398-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:08:17 crc kubenswrapper[4909]: I1201 12:08:17.429842 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wcdnv"] Dec 01 12:08:17 crc kubenswrapper[4909]: I1201 12:08:17.438712 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wcdnv"] Dec 01 12:08:19 crc kubenswrapper[4909]: I1201 12:08:19.267976 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c3e7a58-aec3-403c-8c6a-5cbc9df94398" path="/var/lib/kubelet/pods/1c3e7a58-aec3-403c-8c6a-5cbc9df94398/volumes" Dec 01 12:08:22 crc kubenswrapper[4909]: I1201 12:08:22.257509 4909 scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:08:22 crc kubenswrapper[4909]: E1201 12:08:22.258341 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 
12:08:25 crc kubenswrapper[4909]: I1201 12:08:25.905158 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g8zrz"] Dec 01 12:08:25 crc kubenswrapper[4909]: E1201 12:08:25.906146 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c3e7a58-aec3-403c-8c6a-5cbc9df94398" containerName="extract-utilities" Dec 01 12:08:25 crc kubenswrapper[4909]: I1201 12:08:25.906167 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c3e7a58-aec3-403c-8c6a-5cbc9df94398" containerName="extract-utilities" Dec 01 12:08:25 crc kubenswrapper[4909]: E1201 12:08:25.906189 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c3e7a58-aec3-403c-8c6a-5cbc9df94398" containerName="extract-content" Dec 01 12:08:25 crc kubenswrapper[4909]: I1201 12:08:25.906197 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c3e7a58-aec3-403c-8c6a-5cbc9df94398" containerName="extract-content" Dec 01 12:08:25 crc kubenswrapper[4909]: E1201 12:08:25.906209 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c3e7a58-aec3-403c-8c6a-5cbc9df94398" containerName="registry-server" Dec 01 12:08:25 crc kubenswrapper[4909]: I1201 12:08:25.906215 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c3e7a58-aec3-403c-8c6a-5cbc9df94398" containerName="registry-server" Dec 01 12:08:25 crc kubenswrapper[4909]: I1201 12:08:25.906438 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c3e7a58-aec3-403c-8c6a-5cbc9df94398" containerName="registry-server" Dec 01 12:08:25 crc kubenswrapper[4909]: I1201 12:08:25.907813 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g8zrz" Dec 01 12:08:25 crc kubenswrapper[4909]: I1201 12:08:25.922163 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g8zrz"] Dec 01 12:08:26 crc kubenswrapper[4909]: I1201 12:08:26.020688 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128b3dc3-287a-43cf-a263-88b6c04ace73-utilities\") pod \"redhat-operators-g8zrz\" (UID: \"128b3dc3-287a-43cf-a263-88b6c04ace73\") " pod="openshift-marketplace/redhat-operators-g8zrz" Dec 01 12:08:26 crc kubenswrapper[4909]: I1201 12:08:26.020954 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128b3dc3-287a-43cf-a263-88b6c04ace73-catalog-content\") pod \"redhat-operators-g8zrz\" (UID: \"128b3dc3-287a-43cf-a263-88b6c04ace73\") " pod="openshift-marketplace/redhat-operators-g8zrz" Dec 01 12:08:26 crc kubenswrapper[4909]: I1201 12:08:26.021004 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pg8n\" (UniqueName: \"kubernetes.io/projected/128b3dc3-287a-43cf-a263-88b6c04ace73-kube-api-access-4pg8n\") pod \"redhat-operators-g8zrz\" (UID: \"128b3dc3-287a-43cf-a263-88b6c04ace73\") " pod="openshift-marketplace/redhat-operators-g8zrz" Dec 01 12:08:26 crc kubenswrapper[4909]: I1201 12:08:26.123621 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128b3dc3-287a-43cf-a263-88b6c04ace73-utilities\") pod \"redhat-operators-g8zrz\" (UID: \"128b3dc3-287a-43cf-a263-88b6c04ace73\") " pod="openshift-marketplace/redhat-operators-g8zrz" Dec 01 12:08:26 crc kubenswrapper[4909]: I1201 12:08:26.123733 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128b3dc3-287a-43cf-a263-88b6c04ace73-catalog-content\") pod \"redhat-operators-g8zrz\" (UID: \"128b3dc3-287a-43cf-a263-88b6c04ace73\") " pod="openshift-marketplace/redhat-operators-g8zrz" Dec 01 12:08:26 crc kubenswrapper[4909]: I1201 12:08:26.123770 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pg8n\" (UniqueName: \"kubernetes.io/projected/128b3dc3-287a-43cf-a263-88b6c04ace73-kube-api-access-4pg8n\") pod \"redhat-operators-g8zrz\" (UID: \"128b3dc3-287a-43cf-a263-88b6c04ace73\") " pod="openshift-marketplace/redhat-operators-g8zrz" Dec 01 12:08:26 crc kubenswrapper[4909]: I1201 12:08:26.124149 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128b3dc3-287a-43cf-a263-88b6c04ace73-utilities\") pod \"redhat-operators-g8zrz\" (UID: \"128b3dc3-287a-43cf-a263-88b6c04ace73\") " pod="openshift-marketplace/redhat-operators-g8zrz" Dec 01 12:08:26 crc kubenswrapper[4909]: I1201 12:08:26.124421 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128b3dc3-287a-43cf-a263-88b6c04ace73-catalog-content\") pod \"redhat-operators-g8zrz\" (UID: \"128b3dc3-287a-43cf-a263-88b6c04ace73\") " pod="openshift-marketplace/redhat-operators-g8zrz" Dec 01 12:08:26 crc kubenswrapper[4909]: I1201 12:08:26.153173 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pg8n\" (UniqueName: \"kubernetes.io/projected/128b3dc3-287a-43cf-a263-88b6c04ace73-kube-api-access-4pg8n\") pod \"redhat-operators-g8zrz\" (UID: \"128b3dc3-287a-43cf-a263-88b6c04ace73\") " pod="openshift-marketplace/redhat-operators-g8zrz" Dec 01 12:08:26 crc kubenswrapper[4909]: I1201 12:08:26.242751 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g8zrz" Dec 01 12:08:26 crc kubenswrapper[4909]: I1201 12:08:26.740025 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g8zrz"] Dec 01 12:08:26 crc kubenswrapper[4909]: W1201 12:08:26.756756 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod128b3dc3_287a_43cf_a263_88b6c04ace73.slice/crio-33c9ea8cf431f5aea498549f30388b2cf62cef59480076fe81ab4dd8e56421a6 WatchSource:0}: Error finding container 33c9ea8cf431f5aea498549f30388b2cf62cef59480076fe81ab4dd8e56421a6: Status 404 returned error can't find the container with id 33c9ea8cf431f5aea498549f30388b2cf62cef59480076fe81ab4dd8e56421a6 Dec 01 12:08:27 crc kubenswrapper[4909]: I1201 12:08:27.633037 4909 generic.go:334] "Generic (PLEG): container finished" podID="128b3dc3-287a-43cf-a263-88b6c04ace73" containerID="1b339ec255b2819715aec71705e436bf4b4355d5401df6a26989dc14e1530865" exitCode=0 Dec 01 12:08:27 crc kubenswrapper[4909]: I1201 12:08:27.633231 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g8zrz" event={"ID":"128b3dc3-287a-43cf-a263-88b6c04ace73","Type":"ContainerDied","Data":"1b339ec255b2819715aec71705e436bf4b4355d5401df6a26989dc14e1530865"} Dec 01 12:08:27 crc kubenswrapper[4909]: I1201 12:08:27.633348 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g8zrz" event={"ID":"128b3dc3-287a-43cf-a263-88b6c04ace73","Type":"ContainerStarted","Data":"33c9ea8cf431f5aea498549f30388b2cf62cef59480076fe81ab4dd8e56421a6"} Dec 01 12:08:30 crc kubenswrapper[4909]: I1201 12:08:30.665641 4909 generic.go:334] "Generic (PLEG): container finished" podID="128b3dc3-287a-43cf-a263-88b6c04ace73" containerID="963b39af13d3f43dd97c58f9bedd882860402a3dd778bfff045e5925c576e0f7" exitCode=0 Dec 01 12:08:30 crc kubenswrapper[4909]: I1201 12:08:30.666083 
4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g8zrz" event={"ID":"128b3dc3-287a-43cf-a263-88b6c04ace73","Type":"ContainerDied","Data":"963b39af13d3f43dd97c58f9bedd882860402a3dd778bfff045e5925c576e0f7"} Dec 01 12:08:31 crc kubenswrapper[4909]: I1201 12:08:31.684381 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g8zrz" event={"ID":"128b3dc3-287a-43cf-a263-88b6c04ace73","Type":"ContainerStarted","Data":"446687ceb6a38fefb1a26fe233faf663d70d2565d1c19ea491e96baa5ace0085"} Dec 01 12:08:31 crc kubenswrapper[4909]: I1201 12:08:31.721012 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g8zrz" podStartSLOduration=3.024115655 podStartE2EDuration="6.720990452s" podCreationTimestamp="2025-12-01 12:08:25 +0000 UTC" firstStartedPulling="2025-12-01 12:08:27.636660831 +0000 UTC m=+5824.871131729" lastFinishedPulling="2025-12-01 12:08:31.333535628 +0000 UTC m=+5828.568006526" observedRunningTime="2025-12-01 12:08:31.703274258 +0000 UTC m=+5828.937745166" watchObservedRunningTime="2025-12-01 12:08:31.720990452 +0000 UTC m=+5828.955461350" Dec 01 12:08:33 crc kubenswrapper[4909]: I1201 12:08:33.263901 4909 scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:08:33 crc kubenswrapper[4909]: E1201 12:08:33.264754 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:08:36 crc kubenswrapper[4909]: I1201 12:08:36.242947 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g8zrz" Dec 01 12:08:36 crc kubenswrapper[4909]: I1201 12:08:36.245396 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g8zrz" Dec 01 12:08:37 crc kubenswrapper[4909]: I1201 12:08:37.295477 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g8zrz" podUID="128b3dc3-287a-43cf-a263-88b6c04ace73" containerName="registry-server" probeResult="failure" output=< Dec 01 12:08:37 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Dec 01 12:08:37 crc kubenswrapper[4909]: > Dec 01 12:08:46 crc kubenswrapper[4909]: I1201 12:08:46.289275 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g8zrz" Dec 01 12:08:46 crc kubenswrapper[4909]: I1201 12:08:46.347952 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g8zrz" Dec 01 12:08:46 crc kubenswrapper[4909]: I1201 12:08:46.531283 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g8zrz"] Dec 01 12:08:47 crc kubenswrapper[4909]: I1201 12:08:47.257768 4909 scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:08:47 crc kubenswrapper[4909]: E1201 12:08:47.258035 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:08:47 crc kubenswrapper[4909]: I1201 12:08:47.836990 4909 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g8zrz" podUID="128b3dc3-287a-43cf-a263-88b6c04ace73" containerName="registry-server" containerID="cri-o://446687ceb6a38fefb1a26fe233faf663d70d2565d1c19ea491e96baa5ace0085" gracePeriod=2 Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.329621 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g8zrz" Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.494933 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pg8n\" (UniqueName: \"kubernetes.io/projected/128b3dc3-287a-43cf-a263-88b6c04ace73-kube-api-access-4pg8n\") pod \"128b3dc3-287a-43cf-a263-88b6c04ace73\" (UID: \"128b3dc3-287a-43cf-a263-88b6c04ace73\") " Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.495028 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128b3dc3-287a-43cf-a263-88b6c04ace73-utilities\") pod \"128b3dc3-287a-43cf-a263-88b6c04ace73\" (UID: \"128b3dc3-287a-43cf-a263-88b6c04ace73\") " Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.495050 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128b3dc3-287a-43cf-a263-88b6c04ace73-catalog-content\") pod \"128b3dc3-287a-43cf-a263-88b6c04ace73\" (UID: \"128b3dc3-287a-43cf-a263-88b6c04ace73\") " Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.496133 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128b3dc3-287a-43cf-a263-88b6c04ace73-utilities" (OuterVolumeSpecName: "utilities") pod "128b3dc3-287a-43cf-a263-88b6c04ace73" (UID: "128b3dc3-287a-43cf-a263-88b6c04ace73"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.514381 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/128b3dc3-287a-43cf-a263-88b6c04ace73-kube-api-access-4pg8n" (OuterVolumeSpecName: "kube-api-access-4pg8n") pod "128b3dc3-287a-43cf-a263-88b6c04ace73" (UID: "128b3dc3-287a-43cf-a263-88b6c04ace73"). InnerVolumeSpecName "kube-api-access-4pg8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.597040 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pg8n\" (UniqueName: \"kubernetes.io/projected/128b3dc3-287a-43cf-a263-88b6c04ace73-kube-api-access-4pg8n\") on node \"crc\" DevicePath \"\"" Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.597088 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128b3dc3-287a-43cf-a263-88b6c04ace73-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.611738 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128b3dc3-287a-43cf-a263-88b6c04ace73-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "128b3dc3-287a-43cf-a263-88b6c04ace73" (UID: "128b3dc3-287a-43cf-a263-88b6c04ace73"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.698525 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128b3dc3-287a-43cf-a263-88b6c04ace73-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.846061 4909 generic.go:334] "Generic (PLEG): container finished" podID="128b3dc3-287a-43cf-a263-88b6c04ace73" containerID="446687ceb6a38fefb1a26fe233faf663d70d2565d1c19ea491e96baa5ace0085" exitCode=0 Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.846099 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g8zrz" event={"ID":"128b3dc3-287a-43cf-a263-88b6c04ace73","Type":"ContainerDied","Data":"446687ceb6a38fefb1a26fe233faf663d70d2565d1c19ea491e96baa5ace0085"} Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.846129 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g8zrz" event={"ID":"128b3dc3-287a-43cf-a263-88b6c04ace73","Type":"ContainerDied","Data":"33c9ea8cf431f5aea498549f30388b2cf62cef59480076fe81ab4dd8e56421a6"} Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.846172 4909 scope.go:117] "RemoveContainer" containerID="446687ceb6a38fefb1a26fe233faf663d70d2565d1c19ea491e96baa5ace0085" Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.847070 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g8zrz" Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.868753 4909 scope.go:117] "RemoveContainer" containerID="963b39af13d3f43dd97c58f9bedd882860402a3dd778bfff045e5925c576e0f7" Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.901037 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g8zrz"] Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.908914 4909 scope.go:117] "RemoveContainer" containerID="1b339ec255b2819715aec71705e436bf4b4355d5401df6a26989dc14e1530865" Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.913835 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g8zrz"] Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.945527 4909 scope.go:117] "RemoveContainer" containerID="446687ceb6a38fefb1a26fe233faf663d70d2565d1c19ea491e96baa5ace0085" Dec 01 12:08:48 crc kubenswrapper[4909]: E1201 12:08:48.945909 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"446687ceb6a38fefb1a26fe233faf663d70d2565d1c19ea491e96baa5ace0085\": container with ID starting with 446687ceb6a38fefb1a26fe233faf663d70d2565d1c19ea491e96baa5ace0085 not found: ID does not exist" containerID="446687ceb6a38fefb1a26fe233faf663d70d2565d1c19ea491e96baa5ace0085" Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.945944 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"446687ceb6a38fefb1a26fe233faf663d70d2565d1c19ea491e96baa5ace0085"} err="failed to get container status \"446687ceb6a38fefb1a26fe233faf663d70d2565d1c19ea491e96baa5ace0085\": rpc error: code = NotFound desc = could not find container \"446687ceb6a38fefb1a26fe233faf663d70d2565d1c19ea491e96baa5ace0085\": container with ID starting with 446687ceb6a38fefb1a26fe233faf663d70d2565d1c19ea491e96baa5ace0085 not found: ID does 
not exist" Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.945966 4909 scope.go:117] "RemoveContainer" containerID="963b39af13d3f43dd97c58f9bedd882860402a3dd778bfff045e5925c576e0f7" Dec 01 12:08:48 crc kubenswrapper[4909]: E1201 12:08:48.946582 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"963b39af13d3f43dd97c58f9bedd882860402a3dd778bfff045e5925c576e0f7\": container with ID starting with 963b39af13d3f43dd97c58f9bedd882860402a3dd778bfff045e5925c576e0f7 not found: ID does not exist" containerID="963b39af13d3f43dd97c58f9bedd882860402a3dd778bfff045e5925c576e0f7" Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.946664 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963b39af13d3f43dd97c58f9bedd882860402a3dd778bfff045e5925c576e0f7"} err="failed to get container status \"963b39af13d3f43dd97c58f9bedd882860402a3dd778bfff045e5925c576e0f7\": rpc error: code = NotFound desc = could not find container \"963b39af13d3f43dd97c58f9bedd882860402a3dd778bfff045e5925c576e0f7\": container with ID starting with 963b39af13d3f43dd97c58f9bedd882860402a3dd778bfff045e5925c576e0f7 not found: ID does not exist" Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.946697 4909 scope.go:117] "RemoveContainer" containerID="1b339ec255b2819715aec71705e436bf4b4355d5401df6a26989dc14e1530865" Dec 01 12:08:48 crc kubenswrapper[4909]: E1201 12:08:48.947110 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b339ec255b2819715aec71705e436bf4b4355d5401df6a26989dc14e1530865\": container with ID starting with 1b339ec255b2819715aec71705e436bf4b4355d5401df6a26989dc14e1530865 not found: ID does not exist" containerID="1b339ec255b2819715aec71705e436bf4b4355d5401df6a26989dc14e1530865" Dec 01 12:08:48 crc kubenswrapper[4909]: I1201 12:08:48.947160 4909 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b339ec255b2819715aec71705e436bf4b4355d5401df6a26989dc14e1530865"} err="failed to get container status \"1b339ec255b2819715aec71705e436bf4b4355d5401df6a26989dc14e1530865\": rpc error: code = NotFound desc = could not find container \"1b339ec255b2819715aec71705e436bf4b4355d5401df6a26989dc14e1530865\": container with ID starting with 1b339ec255b2819715aec71705e436bf4b4355d5401df6a26989dc14e1530865 not found: ID does not exist" Dec 01 12:08:49 crc kubenswrapper[4909]: I1201 12:08:49.268630 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="128b3dc3-287a-43cf-a263-88b6c04ace73" path="/var/lib/kubelet/pods/128b3dc3-287a-43cf-a263-88b6c04ace73/volumes" Dec 01 12:08:52 crc kubenswrapper[4909]: I1201 12:08:52.887104 4909 generic.go:334] "Generic (PLEG): container finished" podID="219f1941-d19d-4d35-9783-49159faf5cf4" containerID="344bb44bf6d8a018c1939f097e23d5477ad6b1c110a03e01b0f2e598c6609873" exitCode=0 Dec 01 12:08:52 crc kubenswrapper[4909]: I1201 12:08:52.887259 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r2v9j/must-gather-zmzgc" event={"ID":"219f1941-d19d-4d35-9783-49159faf5cf4","Type":"ContainerDied","Data":"344bb44bf6d8a018c1939f097e23d5477ad6b1c110a03e01b0f2e598c6609873"} Dec 01 12:08:52 crc kubenswrapper[4909]: I1201 12:08:52.889622 4909 scope.go:117] "RemoveContainer" containerID="344bb44bf6d8a018c1939f097e23d5477ad6b1c110a03e01b0f2e598c6609873" Dec 01 12:08:53 crc kubenswrapper[4909]: I1201 12:08:53.520557 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r2v9j_must-gather-zmzgc_219f1941-d19d-4d35-9783-49159faf5cf4/gather/0.log" Dec 01 12:08:55 crc kubenswrapper[4909]: E1201 12:08:55.514452 4909 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.72:44586->38.129.56.72:33131: write tcp 38.129.56.72:44586->38.129.56.72:33131: write: broken pipe Dec 01 12:09:00 crc 
kubenswrapper[4909]: I1201 12:09:00.257022 4909 scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:09:00 crc kubenswrapper[4909]: E1201 12:09:00.258016 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:09:01 crc kubenswrapper[4909]: I1201 12:09:01.270528 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r2v9j/must-gather-zmzgc"] Dec 01 12:09:01 crc kubenswrapper[4909]: I1201 12:09:01.270767 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-r2v9j/must-gather-zmzgc" podUID="219f1941-d19d-4d35-9783-49159faf5cf4" containerName="copy" containerID="cri-o://787b422e6497301875faad24d946fcf152f575377ff8047e9b3eeab00e2550f4" gracePeriod=2 Dec 01 12:09:01 crc kubenswrapper[4909]: I1201 12:09:01.273618 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r2v9j/must-gather-zmzgc"] Dec 01 12:09:01 crc kubenswrapper[4909]: I1201 12:09:01.877983 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r2v9j_must-gather-zmzgc_219f1941-d19d-4d35-9783-49159faf5cf4/copy/0.log" Dec 01 12:09:01 crc kubenswrapper[4909]: I1201 12:09:01.878859 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r2v9j/must-gather-zmzgc" Dec 01 12:09:01 crc kubenswrapper[4909]: I1201 12:09:01.970519 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r2v9j_must-gather-zmzgc_219f1941-d19d-4d35-9783-49159faf5cf4/copy/0.log" Dec 01 12:09:01 crc kubenswrapper[4909]: I1201 12:09:01.971761 4909 generic.go:334] "Generic (PLEG): container finished" podID="219f1941-d19d-4d35-9783-49159faf5cf4" containerID="787b422e6497301875faad24d946fcf152f575377ff8047e9b3eeab00e2550f4" exitCode=143 Dec 01 12:09:01 crc kubenswrapper[4909]: I1201 12:09:01.971822 4909 scope.go:117] "RemoveContainer" containerID="787b422e6497301875faad24d946fcf152f575377ff8047e9b3eeab00e2550f4" Dec 01 12:09:01 crc kubenswrapper[4909]: I1201 12:09:01.971913 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r2v9j/must-gather-zmzgc" Dec 01 12:09:02 crc kubenswrapper[4909]: I1201 12:09:02.001248 4909 scope.go:117] "RemoveContainer" containerID="344bb44bf6d8a018c1939f097e23d5477ad6b1c110a03e01b0f2e598c6609873" Dec 01 12:09:02 crc kubenswrapper[4909]: I1201 12:09:02.052230 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/219f1941-d19d-4d35-9783-49159faf5cf4-must-gather-output\") pod \"219f1941-d19d-4d35-9783-49159faf5cf4\" (UID: \"219f1941-d19d-4d35-9783-49159faf5cf4\") " Dec 01 12:09:02 crc kubenswrapper[4909]: I1201 12:09:02.052328 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74shm\" (UniqueName: \"kubernetes.io/projected/219f1941-d19d-4d35-9783-49159faf5cf4-kube-api-access-74shm\") pod \"219f1941-d19d-4d35-9783-49159faf5cf4\" (UID: \"219f1941-d19d-4d35-9783-49159faf5cf4\") " Dec 01 12:09:02 crc kubenswrapper[4909]: I1201 12:09:02.059980 4909 scope.go:117] "RemoveContainer" 
containerID="787b422e6497301875faad24d946fcf152f575377ff8047e9b3eeab00e2550f4" Dec 01 12:09:02 crc kubenswrapper[4909]: E1201 12:09:02.062401 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"787b422e6497301875faad24d946fcf152f575377ff8047e9b3eeab00e2550f4\": container with ID starting with 787b422e6497301875faad24d946fcf152f575377ff8047e9b3eeab00e2550f4 not found: ID does not exist" containerID="787b422e6497301875faad24d946fcf152f575377ff8047e9b3eeab00e2550f4" Dec 01 12:09:02 crc kubenswrapper[4909]: I1201 12:09:02.062528 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"787b422e6497301875faad24d946fcf152f575377ff8047e9b3eeab00e2550f4"} err="failed to get container status \"787b422e6497301875faad24d946fcf152f575377ff8047e9b3eeab00e2550f4\": rpc error: code = NotFound desc = could not find container \"787b422e6497301875faad24d946fcf152f575377ff8047e9b3eeab00e2550f4\": container with ID starting with 787b422e6497301875faad24d946fcf152f575377ff8047e9b3eeab00e2550f4 not found: ID does not exist" Dec 01 12:09:02 crc kubenswrapper[4909]: I1201 12:09:02.062631 4909 scope.go:117] "RemoveContainer" containerID="344bb44bf6d8a018c1939f097e23d5477ad6b1c110a03e01b0f2e598c6609873" Dec 01 12:09:02 crc kubenswrapper[4909]: I1201 12:09:02.062557 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/219f1941-d19d-4d35-9783-49159faf5cf4-kube-api-access-74shm" (OuterVolumeSpecName: "kube-api-access-74shm") pod "219f1941-d19d-4d35-9783-49159faf5cf4" (UID: "219f1941-d19d-4d35-9783-49159faf5cf4"). InnerVolumeSpecName "kube-api-access-74shm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:09:02 crc kubenswrapper[4909]: E1201 12:09:02.063238 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"344bb44bf6d8a018c1939f097e23d5477ad6b1c110a03e01b0f2e598c6609873\": container with ID starting with 344bb44bf6d8a018c1939f097e23d5477ad6b1c110a03e01b0f2e598c6609873 not found: ID does not exist" containerID="344bb44bf6d8a018c1939f097e23d5477ad6b1c110a03e01b0f2e598c6609873" Dec 01 12:09:02 crc kubenswrapper[4909]: I1201 12:09:02.063261 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"344bb44bf6d8a018c1939f097e23d5477ad6b1c110a03e01b0f2e598c6609873"} err="failed to get container status \"344bb44bf6d8a018c1939f097e23d5477ad6b1c110a03e01b0f2e598c6609873\": rpc error: code = NotFound desc = could not find container \"344bb44bf6d8a018c1939f097e23d5477ad6b1c110a03e01b0f2e598c6609873\": container with ID starting with 344bb44bf6d8a018c1939f097e23d5477ad6b1c110a03e01b0f2e598c6609873 not found: ID does not exist" Dec 01 12:09:02 crc kubenswrapper[4909]: I1201 12:09:02.155264 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74shm\" (UniqueName: \"kubernetes.io/projected/219f1941-d19d-4d35-9783-49159faf5cf4-kube-api-access-74shm\") on node \"crc\" DevicePath \"\"" Dec 01 12:09:02 crc kubenswrapper[4909]: I1201 12:09:02.201577 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/219f1941-d19d-4d35-9783-49159faf5cf4-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "219f1941-d19d-4d35-9783-49159faf5cf4" (UID: "219f1941-d19d-4d35-9783-49159faf5cf4"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:09:02 crc kubenswrapper[4909]: I1201 12:09:02.256889 4909 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/219f1941-d19d-4d35-9783-49159faf5cf4-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 01 12:09:03 crc kubenswrapper[4909]: I1201 12:09:03.268809 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="219f1941-d19d-4d35-9783-49159faf5cf4" path="/var/lib/kubelet/pods/219f1941-d19d-4d35-9783-49159faf5cf4/volumes" Dec 01 12:09:12 crc kubenswrapper[4909]: I1201 12:09:12.257973 4909 scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:09:12 crc kubenswrapper[4909]: E1201 12:09:12.258813 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:09:27 crc kubenswrapper[4909]: I1201 12:09:27.257589 4909 scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:09:27 crc kubenswrapper[4909]: E1201 12:09:27.258372 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:09:38 crc kubenswrapper[4909]: I1201 12:09:38.260516 4909 
scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:09:38 crc kubenswrapper[4909]: E1201 12:09:38.261513 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:09:51 crc kubenswrapper[4909]: I1201 12:09:51.257641 4909 scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:09:51 crc kubenswrapper[4909]: E1201 12:09:51.258507 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:10:06 crc kubenswrapper[4909]: I1201 12:10:06.257905 4909 scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:10:06 crc kubenswrapper[4909]: E1201 12:10:06.259065 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:10:18 crc kubenswrapper[4909]: I1201 
12:10:18.258034 4909 scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:10:18 crc kubenswrapper[4909]: E1201 12:10:18.261320 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:10:32 crc kubenswrapper[4909]: I1201 12:10:32.257977 4909 scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:10:32 crc kubenswrapper[4909]: E1201 12:10:32.258833 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4pcf2_openshift-machine-config-operator(672850e4-d044-44cc-b8a2-517dc1a285be)\"" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" Dec 01 12:10:46 crc kubenswrapper[4909]: I1201 12:10:46.257144 4909 scope.go:117] "RemoveContainer" containerID="1db504c8c4670861e4056684a5734a62f6336f5db1d19aca94dcf06330e620c3" Dec 01 12:10:46 crc kubenswrapper[4909]: I1201 12:10:46.901754 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" event={"ID":"672850e4-d044-44cc-b8a2-517dc1a285be","Type":"ContainerStarted","Data":"52f5018d001903f17a55fab559d1e565cbdf8646c3c20bfa3db33a0bac5e5cb5"} Dec 01 12:12:24 crc kubenswrapper[4909]: I1201 12:12:24.552660 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j24xc"] Dec 01 12:12:24 
crc kubenswrapper[4909]: E1201 12:12:24.554142 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219f1941-d19d-4d35-9783-49159faf5cf4" containerName="copy" Dec 01 12:12:24 crc kubenswrapper[4909]: I1201 12:12:24.554164 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="219f1941-d19d-4d35-9783-49159faf5cf4" containerName="copy" Dec 01 12:12:24 crc kubenswrapper[4909]: E1201 12:12:24.554190 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219f1941-d19d-4d35-9783-49159faf5cf4" containerName="gather" Dec 01 12:12:24 crc kubenswrapper[4909]: I1201 12:12:24.554199 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="219f1941-d19d-4d35-9783-49159faf5cf4" containerName="gather" Dec 01 12:12:24 crc kubenswrapper[4909]: E1201 12:12:24.554234 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128b3dc3-287a-43cf-a263-88b6c04ace73" containerName="extract-utilities" Dec 01 12:12:24 crc kubenswrapper[4909]: I1201 12:12:24.554246 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="128b3dc3-287a-43cf-a263-88b6c04ace73" containerName="extract-utilities" Dec 01 12:12:24 crc kubenswrapper[4909]: E1201 12:12:24.554264 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128b3dc3-287a-43cf-a263-88b6c04ace73" containerName="registry-server" Dec 01 12:12:24 crc kubenswrapper[4909]: I1201 12:12:24.554272 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="128b3dc3-287a-43cf-a263-88b6c04ace73" containerName="registry-server" Dec 01 12:12:24 crc kubenswrapper[4909]: E1201 12:12:24.554288 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128b3dc3-287a-43cf-a263-88b6c04ace73" containerName="extract-content" Dec 01 12:12:24 crc kubenswrapper[4909]: I1201 12:12:24.554296 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="128b3dc3-287a-43cf-a263-88b6c04ace73" containerName="extract-content" Dec 01 12:12:24 crc kubenswrapper[4909]: I1201 12:12:24.554634 
4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="219f1941-d19d-4d35-9783-49159faf5cf4" containerName="gather" Dec 01 12:12:24 crc kubenswrapper[4909]: I1201 12:12:24.554655 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="219f1941-d19d-4d35-9783-49159faf5cf4" containerName="copy" Dec 01 12:12:24 crc kubenswrapper[4909]: I1201 12:12:24.554689 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="128b3dc3-287a-43cf-a263-88b6c04ace73" containerName="registry-server" Dec 01 12:12:24 crc kubenswrapper[4909]: I1201 12:12:24.556813 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j24xc" Dec 01 12:12:24 crc kubenswrapper[4909]: I1201 12:12:24.582942 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j24xc"] Dec 01 12:12:24 crc kubenswrapper[4909]: I1201 12:12:24.671891 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baa75d32-df48-434c-889a-6a4687bba1b9-catalog-content\") pod \"certified-operators-j24xc\" (UID: \"baa75d32-df48-434c-889a-6a4687bba1b9\") " pod="openshift-marketplace/certified-operators-j24xc" Dec 01 12:12:24 crc kubenswrapper[4909]: I1201 12:12:24.671977 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47q8w\" (UniqueName: \"kubernetes.io/projected/baa75d32-df48-434c-889a-6a4687bba1b9-kube-api-access-47q8w\") pod \"certified-operators-j24xc\" (UID: \"baa75d32-df48-434c-889a-6a4687bba1b9\") " pod="openshift-marketplace/certified-operators-j24xc" Dec 01 12:12:24 crc kubenswrapper[4909]: I1201 12:12:24.672031 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/baa75d32-df48-434c-889a-6a4687bba1b9-utilities\") pod \"certified-operators-j24xc\" (UID: \"baa75d32-df48-434c-889a-6a4687bba1b9\") " pod="openshift-marketplace/certified-operators-j24xc" Dec 01 12:12:24 crc kubenswrapper[4909]: I1201 12:12:24.774853 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baa75d32-df48-434c-889a-6a4687bba1b9-catalog-content\") pod \"certified-operators-j24xc\" (UID: \"baa75d32-df48-434c-889a-6a4687bba1b9\") " pod="openshift-marketplace/certified-operators-j24xc" Dec 01 12:12:24 crc kubenswrapper[4909]: I1201 12:12:24.775027 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47q8w\" (UniqueName: \"kubernetes.io/projected/baa75d32-df48-434c-889a-6a4687bba1b9-kube-api-access-47q8w\") pod \"certified-operators-j24xc\" (UID: \"baa75d32-df48-434c-889a-6a4687bba1b9\") " pod="openshift-marketplace/certified-operators-j24xc" Dec 01 12:12:24 crc kubenswrapper[4909]: I1201 12:12:24.775080 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baa75d32-df48-434c-889a-6a4687bba1b9-utilities\") pod \"certified-operators-j24xc\" (UID: \"baa75d32-df48-434c-889a-6a4687bba1b9\") " pod="openshift-marketplace/certified-operators-j24xc" Dec 01 12:12:24 crc kubenswrapper[4909]: I1201 12:12:24.775695 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baa75d32-df48-434c-889a-6a4687bba1b9-utilities\") pod \"certified-operators-j24xc\" (UID: \"baa75d32-df48-434c-889a-6a4687bba1b9\") " pod="openshift-marketplace/certified-operators-j24xc" Dec 01 12:12:24 crc kubenswrapper[4909]: I1201 12:12:24.776272 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/baa75d32-df48-434c-889a-6a4687bba1b9-catalog-content\") pod \"certified-operators-j24xc\" (UID: \"baa75d32-df48-434c-889a-6a4687bba1b9\") " pod="openshift-marketplace/certified-operators-j24xc" Dec 01 12:12:24 crc kubenswrapper[4909]: I1201 12:12:24.808030 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47q8w\" (UniqueName: \"kubernetes.io/projected/baa75d32-df48-434c-889a-6a4687bba1b9-kube-api-access-47q8w\") pod \"certified-operators-j24xc\" (UID: \"baa75d32-df48-434c-889a-6a4687bba1b9\") " pod="openshift-marketplace/certified-operators-j24xc" Dec 01 12:12:24 crc kubenswrapper[4909]: I1201 12:12:24.885466 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j24xc" Dec 01 12:12:25 crc kubenswrapper[4909]: I1201 12:12:25.456401 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j24xc"] Dec 01 12:12:26 crc kubenswrapper[4909]: I1201 12:12:26.076725 4909 generic.go:334] "Generic (PLEG): container finished" podID="baa75d32-df48-434c-889a-6a4687bba1b9" containerID="960d57fe535dec61b95a39b3ba8327c8c40c5d5507c9c5be0d38233202af55a7" exitCode=0 Dec 01 12:12:26 crc kubenswrapper[4909]: I1201 12:12:26.076778 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j24xc" event={"ID":"baa75d32-df48-434c-889a-6a4687bba1b9","Type":"ContainerDied","Data":"960d57fe535dec61b95a39b3ba8327c8c40c5d5507c9c5be0d38233202af55a7"} Dec 01 12:12:26 crc kubenswrapper[4909]: I1201 12:12:26.076807 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j24xc" event={"ID":"baa75d32-df48-434c-889a-6a4687bba1b9","Type":"ContainerStarted","Data":"d9f1092b4f8bf14ee378387be825ba0342d1ea61996bb14a4612a626281c34e0"} Dec 01 12:12:26 crc kubenswrapper[4909]: I1201 12:12:26.079367 4909 provider.go:102] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Dec 01 12:12:27 crc kubenswrapper[4909]: I1201 12:12:27.742795 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w7nq4"] Dec 01 12:12:27 crc kubenswrapper[4909]: I1201 12:12:27.746155 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7nq4" Dec 01 12:12:27 crc kubenswrapper[4909]: I1201 12:12:27.771842 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7nq4"] Dec 01 12:12:27 crc kubenswrapper[4909]: I1201 12:12:27.832589 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d7e023e-17f8-4614-8bea-ea1eb03fa3b6-utilities\") pod \"redhat-marketplace-w7nq4\" (UID: \"5d7e023e-17f8-4614-8bea-ea1eb03fa3b6\") " pod="openshift-marketplace/redhat-marketplace-w7nq4" Dec 01 12:12:27 crc kubenswrapper[4909]: I1201 12:12:27.832649 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d7e023e-17f8-4614-8bea-ea1eb03fa3b6-catalog-content\") pod \"redhat-marketplace-w7nq4\" (UID: \"5d7e023e-17f8-4614-8bea-ea1eb03fa3b6\") " pod="openshift-marketplace/redhat-marketplace-w7nq4" Dec 01 12:12:27 crc kubenswrapper[4909]: I1201 12:12:27.833024 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8kkw\" (UniqueName: \"kubernetes.io/projected/5d7e023e-17f8-4614-8bea-ea1eb03fa3b6-kube-api-access-p8kkw\") pod \"redhat-marketplace-w7nq4\" (UID: \"5d7e023e-17f8-4614-8bea-ea1eb03fa3b6\") " pod="openshift-marketplace/redhat-marketplace-w7nq4" Dec 01 12:12:27 crc kubenswrapper[4909]: I1201 12:12:27.934616 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-p8kkw\" (UniqueName: \"kubernetes.io/projected/5d7e023e-17f8-4614-8bea-ea1eb03fa3b6-kube-api-access-p8kkw\") pod \"redhat-marketplace-w7nq4\" (UID: \"5d7e023e-17f8-4614-8bea-ea1eb03fa3b6\") " pod="openshift-marketplace/redhat-marketplace-w7nq4" Dec 01 12:12:27 crc kubenswrapper[4909]: I1201 12:12:27.934700 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d7e023e-17f8-4614-8bea-ea1eb03fa3b6-utilities\") pod \"redhat-marketplace-w7nq4\" (UID: \"5d7e023e-17f8-4614-8bea-ea1eb03fa3b6\") " pod="openshift-marketplace/redhat-marketplace-w7nq4" Dec 01 12:12:27 crc kubenswrapper[4909]: I1201 12:12:27.934746 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d7e023e-17f8-4614-8bea-ea1eb03fa3b6-catalog-content\") pod \"redhat-marketplace-w7nq4\" (UID: \"5d7e023e-17f8-4614-8bea-ea1eb03fa3b6\") " pod="openshift-marketplace/redhat-marketplace-w7nq4" Dec 01 12:12:27 crc kubenswrapper[4909]: I1201 12:12:27.935249 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d7e023e-17f8-4614-8bea-ea1eb03fa3b6-catalog-content\") pod \"redhat-marketplace-w7nq4\" (UID: \"5d7e023e-17f8-4614-8bea-ea1eb03fa3b6\") " pod="openshift-marketplace/redhat-marketplace-w7nq4" Dec 01 12:12:27 crc kubenswrapper[4909]: I1201 12:12:27.935307 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d7e023e-17f8-4614-8bea-ea1eb03fa3b6-utilities\") pod \"redhat-marketplace-w7nq4\" (UID: \"5d7e023e-17f8-4614-8bea-ea1eb03fa3b6\") " pod="openshift-marketplace/redhat-marketplace-w7nq4" Dec 01 12:12:27 crc kubenswrapper[4909]: I1201 12:12:27.958757 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8kkw\" (UniqueName: 
\"kubernetes.io/projected/5d7e023e-17f8-4614-8bea-ea1eb03fa3b6-kube-api-access-p8kkw\") pod \"redhat-marketplace-w7nq4\" (UID: \"5d7e023e-17f8-4614-8bea-ea1eb03fa3b6\") " pod="openshift-marketplace/redhat-marketplace-w7nq4" Dec 01 12:12:28 crc kubenswrapper[4909]: I1201 12:12:28.096393 4909 generic.go:334] "Generic (PLEG): container finished" podID="baa75d32-df48-434c-889a-6a4687bba1b9" containerID="c61c9bb1886c6d1dc2bda17fb02a4db387a35a6c690b87ba115f64dcc93a9197" exitCode=0 Dec 01 12:12:28 crc kubenswrapper[4909]: I1201 12:12:28.096442 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j24xc" event={"ID":"baa75d32-df48-434c-889a-6a4687bba1b9","Type":"ContainerDied","Data":"c61c9bb1886c6d1dc2bda17fb02a4db387a35a6c690b87ba115f64dcc93a9197"} Dec 01 12:12:28 crc kubenswrapper[4909]: I1201 12:12:28.136370 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7nq4" Dec 01 12:12:28 crc kubenswrapper[4909]: I1201 12:12:28.588182 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7nq4"] Dec 01 12:12:28 crc kubenswrapper[4909]: W1201 12:12:28.591026 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d7e023e_17f8_4614_8bea_ea1eb03fa3b6.slice/crio-5cd8cc11f86016dcb5aefe5a28ab7731c271971bc0c72159bdb59d2e2755da5d WatchSource:0}: Error finding container 5cd8cc11f86016dcb5aefe5a28ab7731c271971bc0c72159bdb59d2e2755da5d: Status 404 returned error can't find the container with id 5cd8cc11f86016dcb5aefe5a28ab7731c271971bc0c72159bdb59d2e2755da5d Dec 01 12:12:29 crc kubenswrapper[4909]: I1201 12:12:29.105175 4909 generic.go:334] "Generic (PLEG): container finished" podID="5d7e023e-17f8-4614-8bea-ea1eb03fa3b6" containerID="a5c4239685991c437ef69c67cbdbd0d4c815b538d855d7134cdbaa285a2e1df1" exitCode=0 Dec 01 12:12:29 crc 
kubenswrapper[4909]: I1201 12:12:29.105241 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7nq4" event={"ID":"5d7e023e-17f8-4614-8bea-ea1eb03fa3b6","Type":"ContainerDied","Data":"a5c4239685991c437ef69c67cbdbd0d4c815b538d855d7134cdbaa285a2e1df1"}
Dec 01 12:12:29 crc kubenswrapper[4909]: I1201 12:12:29.105270 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7nq4" event={"ID":"5d7e023e-17f8-4614-8bea-ea1eb03fa3b6","Type":"ContainerStarted","Data":"5cd8cc11f86016dcb5aefe5a28ab7731c271971bc0c72159bdb59d2e2755da5d"}
Dec 01 12:12:29 crc kubenswrapper[4909]: I1201 12:12:29.108570 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j24xc" event={"ID":"baa75d32-df48-434c-889a-6a4687bba1b9","Type":"ContainerStarted","Data":"926067999ddaefa76512c6c4f73340566fa5c8e3cabb2d6eca413da1dbc560b2"}
Dec 01 12:12:29 crc kubenswrapper[4909]: I1201 12:12:29.152162 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j24xc" podStartSLOduration=2.448385826 podStartE2EDuration="5.152142688s" podCreationTimestamp="2025-12-01 12:12:24 +0000 UTC" firstStartedPulling="2025-12-01 12:12:26.079050968 +0000 UTC m=+6063.313521876" lastFinishedPulling="2025-12-01 12:12:28.78280784 +0000 UTC m=+6066.017278738" observedRunningTime="2025-12-01 12:12:29.144375604 +0000 UTC m=+6066.378846502" watchObservedRunningTime="2025-12-01 12:12:29.152142688 +0000 UTC m=+6066.386613596"
Dec 01 12:12:31 crc kubenswrapper[4909]: I1201 12:12:31.129379 4909 generic.go:334] "Generic (PLEG): container finished" podID="5d7e023e-17f8-4614-8bea-ea1eb03fa3b6" containerID="d3dd9f411086d3d672f9b6520fd19cb1a7f7e67725a6ac6bd0e4df517a919cd4" exitCode=0
Dec 01 12:12:31 crc kubenswrapper[4909]: I1201 12:12:31.131059 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7nq4" event={"ID":"5d7e023e-17f8-4614-8bea-ea1eb03fa3b6","Type":"ContainerDied","Data":"d3dd9f411086d3d672f9b6520fd19cb1a7f7e67725a6ac6bd0e4df517a919cd4"}
Dec 01 12:12:32 crc kubenswrapper[4909]: I1201 12:12:32.138099 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7nq4" event={"ID":"5d7e023e-17f8-4614-8bea-ea1eb03fa3b6","Type":"ContainerStarted","Data":"4b5647c70d59251d3e52467ecc1e97664fddee93d20487408aa096392c1f35bf"}
Dec 01 12:12:32 crc kubenswrapper[4909]: I1201 12:12:32.162318 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w7nq4" podStartSLOduration=2.590182767 podStartE2EDuration="5.162299335s" podCreationTimestamp="2025-12-01 12:12:27 +0000 UTC" firstStartedPulling="2025-12-01 12:12:29.10750019 +0000 UTC m=+6066.341971088" lastFinishedPulling="2025-12-01 12:12:31.679616758 +0000 UTC m=+6068.914087656" observedRunningTime="2025-12-01 12:12:32.160361074 +0000 UTC m=+6069.394831982" watchObservedRunningTime="2025-12-01 12:12:32.162299335 +0000 UTC m=+6069.396770233"
Dec 01 12:12:34 crc kubenswrapper[4909]: I1201 12:12:34.886824 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j24xc"
Dec 01 12:12:34 crc kubenswrapper[4909]: I1201 12:12:34.887418 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j24xc"
Dec 01 12:12:34 crc kubenswrapper[4909]: I1201 12:12:34.960385 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j24xc"
Dec 01 12:12:35 crc kubenswrapper[4909]: I1201 12:12:35.232254 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j24xc"
Dec 01 12:12:35 crc kubenswrapper[4909]: I1201 12:12:35.537751 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j24xc"]
Dec 01 12:12:37 crc kubenswrapper[4909]: I1201 12:12:37.188840 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j24xc" podUID="baa75d32-df48-434c-889a-6a4687bba1b9" containerName="registry-server" containerID="cri-o://926067999ddaefa76512c6c4f73340566fa5c8e3cabb2d6eca413da1dbc560b2" gracePeriod=2
Dec 01 12:12:37 crc kubenswrapper[4909]: I1201 12:12:37.646438 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j24xc"
Dec 01 12:12:37 crc kubenswrapper[4909]: I1201 12:12:37.757317 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baa75d32-df48-434c-889a-6a4687bba1b9-catalog-content\") pod \"baa75d32-df48-434c-889a-6a4687bba1b9\" (UID: \"baa75d32-df48-434c-889a-6a4687bba1b9\") "
Dec 01 12:12:37 crc kubenswrapper[4909]: I1201 12:12:37.757473 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baa75d32-df48-434c-889a-6a4687bba1b9-utilities\") pod \"baa75d32-df48-434c-889a-6a4687bba1b9\" (UID: \"baa75d32-df48-434c-889a-6a4687bba1b9\") "
Dec 01 12:12:37 crc kubenswrapper[4909]: I1201 12:12:37.757513 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47q8w\" (UniqueName: \"kubernetes.io/projected/baa75d32-df48-434c-889a-6a4687bba1b9-kube-api-access-47q8w\") pod \"baa75d32-df48-434c-889a-6a4687bba1b9\" (UID: \"baa75d32-df48-434c-889a-6a4687bba1b9\") "
Dec 01 12:12:37 crc kubenswrapper[4909]: I1201 12:12:37.758365 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baa75d32-df48-434c-889a-6a4687bba1b9-utilities" (OuterVolumeSpecName: "utilities") pod "baa75d32-df48-434c-889a-6a4687bba1b9" (UID: "baa75d32-df48-434c-889a-6a4687bba1b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 12:12:37 crc kubenswrapper[4909]: I1201 12:12:37.758655 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baa75d32-df48-434c-889a-6a4687bba1b9-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 12:12:37 crc kubenswrapper[4909]: I1201 12:12:37.764560 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baa75d32-df48-434c-889a-6a4687bba1b9-kube-api-access-47q8w" (OuterVolumeSpecName: "kube-api-access-47q8w") pod "baa75d32-df48-434c-889a-6a4687bba1b9" (UID: "baa75d32-df48-434c-889a-6a4687bba1b9"). InnerVolumeSpecName "kube-api-access-47q8w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 12:12:37 crc kubenswrapper[4909]: I1201 12:12:37.861298 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47q8w\" (UniqueName: \"kubernetes.io/projected/baa75d32-df48-434c-889a-6a4687bba1b9-kube-api-access-47q8w\") on node \"crc\" DevicePath \"\""
Dec 01 12:12:38 crc kubenswrapper[4909]: I1201 12:12:38.065429 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baa75d32-df48-434c-889a-6a4687bba1b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "baa75d32-df48-434c-889a-6a4687bba1b9" (UID: "baa75d32-df48-434c-889a-6a4687bba1b9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 12:12:38 crc kubenswrapper[4909]: I1201 12:12:38.066042 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baa75d32-df48-434c-889a-6a4687bba1b9-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 12:12:38 crc kubenswrapper[4909]: I1201 12:12:38.137470 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w7nq4"
Dec 01 12:12:38 crc kubenswrapper[4909]: I1201 12:12:38.137607 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w7nq4"
Dec 01 12:12:38 crc kubenswrapper[4909]: I1201 12:12:38.189444 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w7nq4"
Dec 01 12:12:38 crc kubenswrapper[4909]: I1201 12:12:38.231949 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j24xc" event={"ID":"baa75d32-df48-434c-889a-6a4687bba1b9","Type":"ContainerDied","Data":"926067999ddaefa76512c6c4f73340566fa5c8e3cabb2d6eca413da1dbc560b2"}
Dec 01 12:12:38 crc kubenswrapper[4909]: I1201 12:12:38.232633 4909 scope.go:117] "RemoveContainer" containerID="926067999ddaefa76512c6c4f73340566fa5c8e3cabb2d6eca413da1dbc560b2"
Dec 01 12:12:38 crc kubenswrapper[4909]: I1201 12:12:38.231799 4909 generic.go:334] "Generic (PLEG): container finished" podID="baa75d32-df48-434c-889a-6a4687bba1b9" containerID="926067999ddaefa76512c6c4f73340566fa5c8e3cabb2d6eca413da1dbc560b2" exitCode=0
Dec 01 12:12:38 crc kubenswrapper[4909]: I1201 12:12:38.232574 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j24xc"
Dec 01 12:12:38 crc kubenswrapper[4909]: I1201 12:12:38.233260 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j24xc" event={"ID":"baa75d32-df48-434c-889a-6a4687bba1b9","Type":"ContainerDied","Data":"d9f1092b4f8bf14ee378387be825ba0342d1ea61996bb14a4612a626281c34e0"}
Dec 01 12:12:38 crc kubenswrapper[4909]: I1201 12:12:38.279376 4909 scope.go:117] "RemoveContainer" containerID="c61c9bb1886c6d1dc2bda17fb02a4db387a35a6c690b87ba115f64dcc93a9197"
Dec 01 12:12:38 crc kubenswrapper[4909]: I1201 12:12:38.287149 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j24xc"]
Dec 01 12:12:38 crc kubenswrapper[4909]: I1201 12:12:38.298940 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j24xc"]
Dec 01 12:12:38 crc kubenswrapper[4909]: I1201 12:12:38.303679 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w7nq4"
Dec 01 12:12:38 crc kubenswrapper[4909]: I1201 12:12:38.317945 4909 scope.go:117] "RemoveContainer" containerID="960d57fe535dec61b95a39b3ba8327c8c40c5d5507c9c5be0d38233202af55a7"
Dec 01 12:12:38 crc kubenswrapper[4909]: I1201 12:12:38.354459 4909 scope.go:117] "RemoveContainer" containerID="926067999ddaefa76512c6c4f73340566fa5c8e3cabb2d6eca413da1dbc560b2"
Dec 01 12:12:38 crc kubenswrapper[4909]: E1201 12:12:38.355108 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"926067999ddaefa76512c6c4f73340566fa5c8e3cabb2d6eca413da1dbc560b2\": container with ID starting with 926067999ddaefa76512c6c4f73340566fa5c8e3cabb2d6eca413da1dbc560b2 not found: ID does not exist" containerID="926067999ddaefa76512c6c4f73340566fa5c8e3cabb2d6eca413da1dbc560b2"
Dec 01 12:12:38 crc kubenswrapper[4909]: I1201 12:12:38.355169 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"926067999ddaefa76512c6c4f73340566fa5c8e3cabb2d6eca413da1dbc560b2"} err="failed to get container status \"926067999ddaefa76512c6c4f73340566fa5c8e3cabb2d6eca413da1dbc560b2\": rpc error: code = NotFound desc = could not find container \"926067999ddaefa76512c6c4f73340566fa5c8e3cabb2d6eca413da1dbc560b2\": container with ID starting with 926067999ddaefa76512c6c4f73340566fa5c8e3cabb2d6eca413da1dbc560b2 not found: ID does not exist"
Dec 01 12:12:38 crc kubenswrapper[4909]: I1201 12:12:38.355205 4909 scope.go:117] "RemoveContainer" containerID="c61c9bb1886c6d1dc2bda17fb02a4db387a35a6c690b87ba115f64dcc93a9197"
Dec 01 12:12:38 crc kubenswrapper[4909]: E1201 12:12:38.355648 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c61c9bb1886c6d1dc2bda17fb02a4db387a35a6c690b87ba115f64dcc93a9197\": container with ID starting with c61c9bb1886c6d1dc2bda17fb02a4db387a35a6c690b87ba115f64dcc93a9197 not found: ID does not exist" containerID="c61c9bb1886c6d1dc2bda17fb02a4db387a35a6c690b87ba115f64dcc93a9197"
Dec 01 12:12:38 crc kubenswrapper[4909]: I1201 12:12:38.355717 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c61c9bb1886c6d1dc2bda17fb02a4db387a35a6c690b87ba115f64dcc93a9197"} err="failed to get container status \"c61c9bb1886c6d1dc2bda17fb02a4db387a35a6c690b87ba115f64dcc93a9197\": rpc error: code = NotFound desc = could not find container \"c61c9bb1886c6d1dc2bda17fb02a4db387a35a6c690b87ba115f64dcc93a9197\": container with ID starting with c61c9bb1886c6d1dc2bda17fb02a4db387a35a6c690b87ba115f64dcc93a9197 not found: ID does not exist"
Dec 01 12:12:38 crc kubenswrapper[4909]: I1201 12:12:38.355766 4909 scope.go:117] "RemoveContainer" containerID="960d57fe535dec61b95a39b3ba8327c8c40c5d5507c9c5be0d38233202af55a7"
Dec 01 12:12:38 crc kubenswrapper[4909]: E1201 12:12:38.356186 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"960d57fe535dec61b95a39b3ba8327c8c40c5d5507c9c5be0d38233202af55a7\": container with ID starting with 960d57fe535dec61b95a39b3ba8327c8c40c5d5507c9c5be0d38233202af55a7 not found: ID does not exist" containerID="960d57fe535dec61b95a39b3ba8327c8c40c5d5507c9c5be0d38233202af55a7"
Dec 01 12:12:38 crc kubenswrapper[4909]: I1201 12:12:38.356222 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"960d57fe535dec61b95a39b3ba8327c8c40c5d5507c9c5be0d38233202af55a7"} err="failed to get container status \"960d57fe535dec61b95a39b3ba8327c8c40c5d5507c9c5be0d38233202af55a7\": rpc error: code = NotFound desc = could not find container \"960d57fe535dec61b95a39b3ba8327c8c40c5d5507c9c5be0d38233202af55a7\": container with ID starting with 960d57fe535dec61b95a39b3ba8327c8c40c5d5507c9c5be0d38233202af55a7 not found: ID does not exist"
Dec 01 12:12:39 crc kubenswrapper[4909]: I1201 12:12:39.274695 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baa75d32-df48-434c-889a-6a4687bba1b9" path="/var/lib/kubelet/pods/baa75d32-df48-434c-889a-6a4687bba1b9/volumes"
Dec 01 12:12:40 crc kubenswrapper[4909]: I1201 12:12:40.533531 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7nq4"]
Dec 01 12:12:41 crc kubenswrapper[4909]: I1201 12:12:41.271126 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w7nq4" podUID="5d7e023e-17f8-4614-8bea-ea1eb03fa3b6" containerName="registry-server" containerID="cri-o://4b5647c70d59251d3e52467ecc1e97664fddee93d20487408aa096392c1f35bf" gracePeriod=2
Dec 01 12:12:41 crc kubenswrapper[4909]: I1201 12:12:41.769914 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7nq4"
Dec 01 12:12:41 crc kubenswrapper[4909]: I1201 12:12:41.851500 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8kkw\" (UniqueName: \"kubernetes.io/projected/5d7e023e-17f8-4614-8bea-ea1eb03fa3b6-kube-api-access-p8kkw\") pod \"5d7e023e-17f8-4614-8bea-ea1eb03fa3b6\" (UID: \"5d7e023e-17f8-4614-8bea-ea1eb03fa3b6\") "
Dec 01 12:12:41 crc kubenswrapper[4909]: I1201 12:12:41.851575 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d7e023e-17f8-4614-8bea-ea1eb03fa3b6-utilities\") pod \"5d7e023e-17f8-4614-8bea-ea1eb03fa3b6\" (UID: \"5d7e023e-17f8-4614-8bea-ea1eb03fa3b6\") "
Dec 01 12:12:41 crc kubenswrapper[4909]: I1201 12:12:41.851601 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d7e023e-17f8-4614-8bea-ea1eb03fa3b6-catalog-content\") pod \"5d7e023e-17f8-4614-8bea-ea1eb03fa3b6\" (UID: \"5d7e023e-17f8-4614-8bea-ea1eb03fa3b6\") "
Dec 01 12:12:41 crc kubenswrapper[4909]: I1201 12:12:41.861009 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d7e023e-17f8-4614-8bea-ea1eb03fa3b6-utilities" (OuterVolumeSpecName: "utilities") pod "5d7e023e-17f8-4614-8bea-ea1eb03fa3b6" (UID: "5d7e023e-17f8-4614-8bea-ea1eb03fa3b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 12:12:41 crc kubenswrapper[4909]: I1201 12:12:41.865868 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d7e023e-17f8-4614-8bea-ea1eb03fa3b6-kube-api-access-p8kkw" (OuterVolumeSpecName: "kube-api-access-p8kkw") pod "5d7e023e-17f8-4614-8bea-ea1eb03fa3b6" (UID: "5d7e023e-17f8-4614-8bea-ea1eb03fa3b6"). InnerVolumeSpecName "kube-api-access-p8kkw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 12:12:41 crc kubenswrapper[4909]: I1201 12:12:41.873189 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d7e023e-17f8-4614-8bea-ea1eb03fa3b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d7e023e-17f8-4614-8bea-ea1eb03fa3b6" (UID: "5d7e023e-17f8-4614-8bea-ea1eb03fa3b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 12:12:41 crc kubenswrapper[4909]: I1201 12:12:41.953597 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8kkw\" (UniqueName: \"kubernetes.io/projected/5d7e023e-17f8-4614-8bea-ea1eb03fa3b6-kube-api-access-p8kkw\") on node \"crc\" DevicePath \"\""
Dec 01 12:12:41 crc kubenswrapper[4909]: I1201 12:12:41.953638 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d7e023e-17f8-4614-8bea-ea1eb03fa3b6-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 12:12:41 crc kubenswrapper[4909]: I1201 12:12:41.953652 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d7e023e-17f8-4614-8bea-ea1eb03fa3b6-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 12:12:42 crc kubenswrapper[4909]: I1201 12:12:42.281149 4909 generic.go:334] "Generic (PLEG): container finished" podID="5d7e023e-17f8-4614-8bea-ea1eb03fa3b6" containerID="4b5647c70d59251d3e52467ecc1e97664fddee93d20487408aa096392c1f35bf" exitCode=0
Dec 01 12:12:42 crc kubenswrapper[4909]: I1201 12:12:42.281191 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7nq4" event={"ID":"5d7e023e-17f8-4614-8bea-ea1eb03fa3b6","Type":"ContainerDied","Data":"4b5647c70d59251d3e52467ecc1e97664fddee93d20487408aa096392c1f35bf"}
Dec 01 12:12:42 crc kubenswrapper[4909]: I1201 12:12:42.281219 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7nq4" event={"ID":"5d7e023e-17f8-4614-8bea-ea1eb03fa3b6","Type":"ContainerDied","Data":"5cd8cc11f86016dcb5aefe5a28ab7731c271971bc0c72159bdb59d2e2755da5d"}
Dec 01 12:12:42 crc kubenswrapper[4909]: I1201 12:12:42.281237 4909 scope.go:117] "RemoveContainer" containerID="4b5647c70d59251d3e52467ecc1e97664fddee93d20487408aa096392c1f35bf"
Dec 01 12:12:42 crc kubenswrapper[4909]: I1201 12:12:42.281350 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7nq4"
Dec 01 12:12:42 crc kubenswrapper[4909]: I1201 12:12:42.305631 4909 scope.go:117] "RemoveContainer" containerID="d3dd9f411086d3d672f9b6520fd19cb1a7f7e67725a6ac6bd0e4df517a919cd4"
Dec 01 12:12:42 crc kubenswrapper[4909]: I1201 12:12:42.336055 4909 scope.go:117] "RemoveContainer" containerID="a5c4239685991c437ef69c67cbdbd0d4c815b538d855d7134cdbaa285a2e1df1"
Dec 01 12:12:42 crc kubenswrapper[4909]: I1201 12:12:42.345919 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7nq4"]
Dec 01 12:12:42 crc kubenswrapper[4909]: I1201 12:12:42.352343 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7nq4"]
Dec 01 12:12:42 crc kubenswrapper[4909]: I1201 12:12:42.387778 4909 scope.go:117] "RemoveContainer" containerID="4b5647c70d59251d3e52467ecc1e97664fddee93d20487408aa096392c1f35bf"
Dec 01 12:12:42 crc kubenswrapper[4909]: E1201 12:12:42.388398 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b5647c70d59251d3e52467ecc1e97664fddee93d20487408aa096392c1f35bf\": container with ID starting with 4b5647c70d59251d3e52467ecc1e97664fddee93d20487408aa096392c1f35bf not found: ID does not exist" containerID="4b5647c70d59251d3e52467ecc1e97664fddee93d20487408aa096392c1f35bf"
Dec 01 12:12:42 crc kubenswrapper[4909]: I1201 12:12:42.388545 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b5647c70d59251d3e52467ecc1e97664fddee93d20487408aa096392c1f35bf"} err="failed to get container status \"4b5647c70d59251d3e52467ecc1e97664fddee93d20487408aa096392c1f35bf\": rpc error: code = NotFound desc = could not find container \"4b5647c70d59251d3e52467ecc1e97664fddee93d20487408aa096392c1f35bf\": container with ID starting with 4b5647c70d59251d3e52467ecc1e97664fddee93d20487408aa096392c1f35bf not found: ID does not exist"
Dec 01 12:12:42 crc kubenswrapper[4909]: I1201 12:12:42.388683 4909 scope.go:117] "RemoveContainer" containerID="d3dd9f411086d3d672f9b6520fd19cb1a7f7e67725a6ac6bd0e4df517a919cd4"
Dec 01 12:12:42 crc kubenswrapper[4909]: E1201 12:12:42.389218 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3dd9f411086d3d672f9b6520fd19cb1a7f7e67725a6ac6bd0e4df517a919cd4\": container with ID starting with d3dd9f411086d3d672f9b6520fd19cb1a7f7e67725a6ac6bd0e4df517a919cd4 not found: ID does not exist" containerID="d3dd9f411086d3d672f9b6520fd19cb1a7f7e67725a6ac6bd0e4df517a919cd4"
Dec 01 12:12:42 crc kubenswrapper[4909]: I1201 12:12:42.389259 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3dd9f411086d3d672f9b6520fd19cb1a7f7e67725a6ac6bd0e4df517a919cd4"} err="failed to get container status \"d3dd9f411086d3d672f9b6520fd19cb1a7f7e67725a6ac6bd0e4df517a919cd4\": rpc error: code = NotFound desc = could not find container \"d3dd9f411086d3d672f9b6520fd19cb1a7f7e67725a6ac6bd0e4df517a919cd4\": container with ID starting with d3dd9f411086d3d672f9b6520fd19cb1a7f7e67725a6ac6bd0e4df517a919cd4 not found: ID does not exist"
Dec 01 12:12:42 crc kubenswrapper[4909]: I1201 12:12:42.389289 4909 scope.go:117] "RemoveContainer" containerID="a5c4239685991c437ef69c67cbdbd0d4c815b538d855d7134cdbaa285a2e1df1"
Dec 01 12:12:42 crc kubenswrapper[4909]: E1201 12:12:42.389517 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5c4239685991c437ef69c67cbdbd0d4c815b538d855d7134cdbaa285a2e1df1\": container with ID starting with a5c4239685991c437ef69c67cbdbd0d4c815b538d855d7134cdbaa285a2e1df1 not found: ID does not exist" containerID="a5c4239685991c437ef69c67cbdbd0d4c815b538d855d7134cdbaa285a2e1df1"
Dec 01 12:12:42 crc kubenswrapper[4909]: I1201 12:12:42.389538 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5c4239685991c437ef69c67cbdbd0d4c815b538d855d7134cdbaa285a2e1df1"} err="failed to get container status \"a5c4239685991c437ef69c67cbdbd0d4c815b538d855d7134cdbaa285a2e1df1\": rpc error: code = NotFound desc = could not find container \"a5c4239685991c437ef69c67cbdbd0d4c815b538d855d7134cdbaa285a2e1df1\": container with ID starting with a5c4239685991c437ef69c67cbdbd0d4c815b538d855d7134cdbaa285a2e1df1 not found: ID does not exist"
Dec 01 12:12:43 crc kubenswrapper[4909]: I1201 12:12:43.269681 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d7e023e-17f8-4614-8bea-ea1eb03fa3b6" path="/var/lib/kubelet/pods/5d7e023e-17f8-4614-8bea-ea1eb03fa3b6/volumes"
Dec 01 12:13:06 crc kubenswrapper[4909]: I1201 12:13:06.193255 4909 patch_prober.go:28] interesting pod/machine-config-daemon-4pcf2 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 12:13:06 crc kubenswrapper[4909]: I1201 12:13:06.193801 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4pcf2" podUID="672850e4-d044-44cc-b8a2-517dc1a285be" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"